

Jobs via Dice
Senior Data Engineer - Python, AWS, Snowflake, Airflow | Contract | Remote
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with strong Python, AWS, Snowflake, and Apache Airflow skills. It is a remote W2 contract of 6+ months paying $70-$80/hr. Experience with ETL processes and handling both structured and unstructured data is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
May 6, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Monitoring #Data Quality #Scripting #Scala #Snowflake #Apache Airflow #AWS S3 (Amazon Simple Storage Service) #Airflow #S3 (Amazon Simple Storage Service) #Data Engineering #ETL (Extract, Transform, Load) #Data Pipeline #Visualization #Python #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Data Access #Cloud #Lambda (AWS Lambda)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Alpha Business Solutions LLC, is seeking the following. Apply via Dice today!
Job Title: Senior Data Engineer (Python, AWS, Snowflake, Airflow)
Location: Remote (Preference for Chicago area candidates)
Duration: 6+ months
Start Date: Mid-May
Interview Process: 2 rounds
Pay rate: $70/HR - $80/HR W2 (negotiable)
Overview
We are seeking a Senior Data Engineer to design and build scalable data pipelines supporting enterprise data initiatives. This role will focus on extracting, transforming, and loading both structured and unstructured data into a modern cloud data platform, primarily leveraging AWS and Snowflake.
You will play a key role in building end-to-end data workflows, orchestrating pipelines, and enabling downstream data consumption through optimized data structures and user-facing interfaces.
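As a rough illustration of the kind of pipeline described above, the sketch below shows one way a Python job might stage a raw extract to S3 and then COPY it into Snowflake. The bucket, stage, table, and credential values are placeholders, not details from this posting.

# Hypothetical sketch: stage a raw extract to S3, then COPY it into Snowflake.
# All names (bucket, stage, table, credentials) are placeholders.
import boto3
import snowflake.connector

def stage_to_s3(local_path: str, bucket: str, key: str) -> None:
    # Upload the raw extract (CSV/JSON/Parquet) to the S3 landing zone.
    boto3.client("s3").upload_file(local_path, bucket, key)

def copy_into_snowflake(key: str) -> None:
    # Load the staged file into a raw Snowflake table via an external stage.
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder
        user="etl_user",           # placeholder
        password="***",            # use a secrets manager in practice
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        conn.cursor().execute(
            "COPY INTO raw.customer_events "
            f"FROM @raw.s3_landing_stage/{key} "
            "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    stage_to_s3("extract.csv", "my-landing-bucket", "events/2026-05-06/extract.csv")
    copy_into_snowflake("events/2026-05-06/extract.csv")

In practice the credentials would come from a secrets manager, and the external stage would be defined once over the landing bucket so each COPY only needs the object key.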
Key Responsibilities
• Design and develop end-to-end data pipelines using Python
• Extract data and documents from enterprise data stores using connectors
• Load and manage data in AWS (S3) and Snowflake
• Process both structured and unstructured data for downstream consumption
• Build and maintain data models and tables for business applications
• Orchestrate workflows using Apache Airflow for scheduling and monitoring
• Collaborate on building UI/data access layers on top of Snowflake
• Ensure data quality, scalability, and performance across pipelines
• Support cross-functional teams with data and reporting needs
Required Skills
• Strong experience in Python (data pipeline development, scripting)
• Hands-on experience with AWS services (especially S3, Lambda)
• Expertise in Snowflake (data loading, modeling, performance tuning)
• Experience with Apache Airflow for orchestration and scheduling
• Solid understanding of ETL/ELT processes
• Experience handling both structured and unstructured data
Nice-to-Have Skills
• Experience building UI or data access layers on top of Snowflake
• Familiarity with data visualization/reporting tools
• Knowledge of Kafka (not required but beneficial for other projects)
• Experience with traditional ETL tools
Project Overview
• Build a cloud-based data pipeline using Python
• Load data into AWS S3, then into Snowflake
• Transform raw data into structured formats for business use
• Orchestrate the entire workflow using Airflow (a minimal sketch follows this list)
• Enable end-user access via UI on top of Snowflake
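The project outline above maps naturally onto a small Airflow DAG. The following is a minimal sketch assuming Airflow 2.4+; the DAG id, schedule, and task callables are placeholders rather than the client's actual pipeline.

# Hypothetical orchestration sketch: a daily DAG tying the project steps together.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull data and documents from source systems via connectors (placeholder).
    ...

def load_to_s3():
    # Write the raw extract to the S3 landing bucket (placeholder).
    ...

def load_to_snowflake():
    # COPY the staged files into Snowflake raw tables (placeholder).
    ...

def transform():
    # Build curated models/tables for business use (placeholder).
    ...

with DAG(
    dag_id="enterprise_data_pipeline",   # placeholder name
    start_date=datetime(2026, 5, 6),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_s3 = PythonOperator(task_id="load_to_s3", python_callable=load_to_s3)
    t_snowflake = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    t_extract >> t_s3 >> t_snowflake >> t_transform

Each callable would wrap the corresponding step (connector extraction, S3 load, Snowflake COPY, transformations), and Airflow's scheduler and web UI cover the scheduling and monitoring called out in the responsibilities.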
Additional Notes
• Remote candidates are acceptable; local candidates near Chicago are a plus
• Interviews will begin shortly, with onboarding targeted for mid-May
Please apply to express your interest. You may also reach out to me at
Thank you,
Ashu
We provide a comprehensive benefits package, which includes:
Benefits
• Medical Insurance for full-time employees
• Dental and Vision Insurance
• Life Insurance, Short-Term Disability, Long-Term Disability, etc.






