

Sr Data Engineer (Snowflake & AWS) / W2 ONLY / Remote
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer (Snowflake & AWS): remote, contract-to-hire, W2 only, lasting more than 6 months. It requires 7+ years in data engineering, with expertise in Snowflake, Python, SQL, AWS services, and ETL/ELT processes.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 19, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed: #SQL (Structured Query Language) #Azure #Lambda (AWS Lambda) #Data Warehouse #Data Integrity #Automation #Python #Fivetran #Scala #ETL (Extract, Transform, Load) #S3 (Amazon Simple Storage Service) #dbt (data build tool) #Azure DevOps #Data Ingestion #Deployment #Data Quality #DevOps #Data Engineering #Snowflake #SNS (Simple Notification Service) #AWS Lambda #AWS (Amazon Web Services) #Data Pipeline #Data Extraction #Airflow #Cloud
Role description
Sr Data Engineer (Snowflake & AWS) / Remote / Contract-to-Hire / W2 ONLY
About our Customer:
Our DIRECT customer, a global leader in the food services industry, is seeking an experienced Sr Data Engineer to work remotely.
This is a very long-term contract position.
About the Sr. Data Engineer:
The Senior Data Engineer is a hands-on role with expertise in developing data ingestion pipelines. This role is crucial in designing, building, and maintaining our data infrastructure, with a focus on creating scalable pipelines, ensuring data integrity, and optimizing performance.
Key skills include strong Snowflake expertise (including administration and performance tuning), Airflow, Python, data validation, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using Airflow on AWS is essential.
Responsibilities:
• Design, build, test, and implement scalable data pipelines using Python and SQL.
• Maintain and optimize our Snowflake data warehouse's performance, including data ingestion and query optimization.
• Extract data from APIs using Python and AWS Lambda, and automate workflows with Airflow on AWS.
• Apply analysis and critical thinking to troubleshoot data-related issues, and implement checks and scripts to enhance data quality.
• Collaborate with other data engineers and architects to develop new pipelines and/or optimize existing ones.
• Maintain code via CI/CD processes as defined in our Azure DevOps platform.
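To illustrate the data-quality work described above, here is a minimal, hypothetical sketch of a record-validation step a pipeline like this might run before loading API data into Snowflake. The field names (`id`, `amount`) and the rules (non-empty key, numeric amount, de-duplication) are assumptions for illustration, not part of the posting.

```python
def validate_records(records):
    """Hypothetical data-quality check: keep records with a non-empty 'id'
    and a numeric 'amount'; drop duplicate ids, keeping the first seen."""
    seen = set()
    clean, rejected = [], []
    for rec in records:
        rec_id = rec.get("id")
        amount = rec.get("amount")
        # Reject missing ids, non-numeric amounts, and duplicate ids.
        if not rec_id or not isinstance(amount, (int, float)) or rec_id in seen:
            rejected.append(rec)
            continue
        seen.add(rec_id)
        clean.append(rec)
    return clean, rejected

if __name__ == "__main__":
    raw = [
        {"id": "a1", "amount": 10.5},
        {"id": "a1", "amount": 10.5},   # duplicate id
        {"id": "", "amount": 3},        # missing id
        {"id": "b2", "amount": "n/a"},  # non-numeric amount
    ]
    good, bad = validate_records(raw)
    print(len(good), len(bad))  # -> 1 3
```

In practice a check like this would sit between the Lambda extraction step and the Snowflake load, with rejected records routed to a quarantine location (e.g. an S3 prefix) for review.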
Qualifications:
• 7+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion.
• Expertise in Snowflake, including data ingestion and performance optimization.
• Strong SQL skills for writing efficient queries and optimizing existing ones.
• Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
• Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.
• Highly self-motivated and detail-oriented, with strong communication skills.
• Familiarity with ETL/ELT processes.
• Familiarity with cloud development and deployment (AWS preferred).
• Experience with Fivetran and dbt is a plus.