

KPG99 INC
Snowflake Data Engineer with Python/Apache/AWS
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer with expertise in Python, Apache Airflow, and AWS. It is a 6+ month remote contract requiring strong skills in data migration, ETL processes, and cloud architecture. Candidates local to Chicago are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 1, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Apache Airflow #Batch #Migration #RDS (Amazon Relational Database Service) #SnowPipe #Snowflake #Cloud #API (Application Programming Interface) #Data Warehouse #Data Pipeline #Streamlit #Lambda (AWS Lambda) #Data Engineering #Airflow #Python
Role description
Please find the job description below and share your latest resume if you are interested.
Position: Snowflake Data Engineer with Python/Apache/AWS
Location: 100% Remote (Need Local to Chicago) or ex Hyatt
Visa Status: Independent Consultant
Duration: 6+ Months (Will be a Long Term Contract or CTH)
PREFERRED TO WORK ON W2 OR 1099
Share your LinkedIn profile, 2-3 professional references, and date of birth (DOB) for Quick Consideration.
Must Have:
10/10 Python
Snowflake
Apache Airflow (an open-source platform for programmatically authoring, scheduling, and monitoring complex data pipelines and workflows in Python)
AWS / S3 and Lambda
Snowflake UI
Extract data from the data store and load it into AWS. The data will be both structured and unstructured. ETL tooling is used for reporting, with output split into separate tables so downstream applications can consume it via Python.
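The extract-and-load flow described above (mixed structured and unstructured input, split into tables for downstream Python consumers) can be sketched in plain Python. The parsing rule and record layout here are hypothetical, stand-ins for whatever the actual data store emits:

```python
import csv
import io
import json

def extract(raw_records):
    """Split incoming records into structured rows and unstructured blobs.

    Hypothetical rule for this sketch: a record is 'structured' if it
    parses as a JSON object; anything else is kept as raw text.
    """
    structured, unstructured = [], []
    for rec in raw_records:
        try:
            obj = json.loads(rec)
            if isinstance(obj, dict):
                structured.append(obj)
                continue
        except json.JSONDecodeError:
            pass
        unstructured.append(rec)
    return structured, unstructured

def to_table(rows):
    """Render structured rows as CSV so downstream apps can consume them."""
    if not rows:
        return ""
    fields = sorted({key for row in rows for key in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

raw = ['{"id": 1, "amount": 9.5}', 'free-form log line', '{"id": 2, "amount": 3.0}']
structured, unstructured = extract(raw)
print(to_table(structured))
```

In a real pipeline the CSV string would instead be written to an S3 landing prefix for Snowflake to ingest; the in-memory version keeps the sketch self-contained.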
Position Responsibilities:
Implement advanced Snowflake capabilities (Streams, Tasks, Snowpipe, data sharing) for real-time and batch processing
Lead migration initiatives from legacy data warehouses to Snowflake with minimal disruption
Design and develop data applications and solutions on Snowflake, including Streamlit apps and Snowflake Native Apps
Hands-on experience with AWS cloud architecture and development using AWS resources such as S3, Lambda, API Gateway, and RDS
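The S3/Lambda/Snowpipe combination mentioned in the responsibilities typically looks like an S3-triggered Lambda that notifies Snowpipe of newly landed files. A minimal handler sketch follows; the bucket and key values are hypothetical, and the actual Snowpipe notification is left as a comment so the sketch stays self-contained:

```python
def handler(event, context):
    """S3-triggered AWS Lambda handler sketch.

    Iterates over the ObjectCreated records Lambda delivers for an S3
    notification and collects the file URIs that would be handed to Snowpipe.
    """
    loaded = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # In a real deployment you would notify Snowpipe here, e.g. via the
        # snowflake-ingest SDK (SimpleIngestManager.ingest_files). Omitted so
        # the sketch has no external dependencies.
        loaded.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "files": loaded}

# Example event in the shape S3 ObjectCreated notifications use (trimmed).
event = {"Records": [{"s3": {"bucket": {"name": "raw-landing"},
                             "object": {"key": "orders/2026/05/01/batch.json"}}}]}
print(handler(event, None))
```

Streams and Tasks would then pick up the Snowpipe-loaded rows inside Snowflake itself, which is SQL-side configuration rather than Lambda code.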
Thanks and Regards
Karan Rajput | US IT Recruiter
Desk: 609-973-8207 || KRajput@kpgtech.com





