Albano Systems, Inc.
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This Data Engineer role is contract-to-hire at a pay rate of "$XX/hour". Required skills include 5+ years of data engineering experience, expertise in AWS and Snowflake, and familiarity with the P&C Commercial Lines business.
Country
United States
Currency
$ USD
-
Day rate
680
-
Date
November 29, 2025
Duration
Unknown
-
Location
Unknown
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Charlotte, NC
-
Skills detailed
#Computer Science #SnowSQL #Cloud #Snowflake #Informatica #Lean #Shell Scripting #Python #ETL (Extract, Transform, Load) #Automated Testing #SQL (Structured Query Language) #Talend #Data Science #Data Pipeline #DataOps #Agile #Scripting #Jenkins #Batch #SnowPipe #GitHub #AWS (Amazon Web Services) #Hadoop #Big Data #Spark (Apache Spark) #Scala #Oracle #Data Engineering
Role description
W2 ONLY - NO CORP TO CORP - CONTRACT TO HIRE - NO VISA SPONSOR/TRANSFER - NO 3RD PARTY AGENCY CANDIDATES
Data Engineer
Serve as subject matter expert and/or technical lead for large-scale data products.
Drive end-to-end solution delivery across multiple platforms and technologies, leveraging ELT solutions to acquire, integrate, and operationalize data.
Partner with architects and stakeholders to define and implement pipeline and data product architecture, ensuring integrity and scalability.
Communicate risks and trade-offs of technology solutions to senior leaders, translating technical concepts for business audiences.
Build and enhance data pipelines using cloud-based architectures.
Design simplified data models for complex business problems.
Champion Data Engineering best practices across teams, implementing leading big data methodologies (AWS, Hadoop/EMR, Spark, Snowflake, Talend, Informatica) in hybrid cloud/on-prem environments.
Operate independently while fostering a collaborative, transformation-focused mindset.
Work effectively in a lean, fast-paced organization, leveraging Scaled Agile principles.
Promote code quality management, FinOps principles, automated testing, and environment management practices to deliver incremental customer value.
Qualifications
5+ years of data engineering experience.
2+ years developing and operating production workloads in cloud infrastructure.
Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
Hands-on experience with Snowflake (including SnowSQL, Snowpipe).
Expert-level skills in AWS services, Snowflake, Python, Spark (certifications are a plus).
Proficiency in ETL tools such as Talend and Informatica.
Strong knowledge of Data Warehousing (modeling, mapping, batch and real-time pipelines).
Experience with DataOps tools (GitHub, Jenkins, UDeploy).
Familiarity with P&C Commercial Lines business.
Knowledge of the legacy tech stack: Oracle Database, PL/SQL, Autosys, Hadoop, stored procedures, Shell scripting.
Experience using Agile tools like Rally.
Excellent written and verbal communication skills to interact effectively with technical and non-technical stakeholders.