

KPG99 INC
W2 Only :: Data Engineer - ETL Migration
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer specializing in ETL migration from Matillion to Apache Airflow. It requires 3+ years of experience; proficiency in Python, SQL, and AWS services; and strong skills in Airflow and cloud data warehouses such as Snowflake. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Engineering #ETL (Extract, Transform, Load) #Python #Aurora #Data Warehouse #RDS (Amazon Relational Database Service) #S3 (Amazon Simple Storage Service) #Migration #GIT #AWS (Amazon Web Services) #Snowflake #Datasets #Matillion #IAM (Identity and Access Management) #Data Pipeline #Airflow #Cloud #Apache Airflow #SQL (Structured Query Language)
Role description
Position Title: Data Engineer – ETL Migration (Matillion to Airflow)
Required Qualifications:
• 3+ years of experience in Data Engineering or ETL development.
• Strong hands-on experience with Apache Airflow or similar orchestration tools.
• Proficiency in Python for building data pipelines and workflows.
• Strong SQL skills with experience handling large datasets.
• Experience with AWS services such as S3, RDS/Aurora, and IAM.
• Experience with cloud data warehouses, preferably Snowflake.
• Proven experience building and maintaining ETL/ELT pipelines.
• Familiarity with Git and CI/CD workflows.
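For candidates gauging what "building ETL pipelines in Python" means in practice, here is a minimal, self-contained sketch of the extract–transform–load pattern using only the standard library. It is illustrative only: the source data, table name, and filter threshold are invented for the example and are not part of this posting (a real pipeline for this role would orchestrate steps like these as Airflow tasks against S3 and Snowflake).

```python
import csv
import io
import sqlite3

# Hypothetical inline CSV source standing in for a real extract (e.g. a file on S3).
RAW_CSV = """order_id,amount
1,19.99
2,5.50
3,42.00
"""

def extract(source: str) -> list[dict]:
    """Extract: read raw rows from the CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and keep only orders of $10.00 or more."""
    typed = [(int(r["order_id"]), float(r["amount"])) for r in rows]
    return [row for row in typed if row[1] >= 10.0]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write transformed rows into a staging table; return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]

# SQLite in memory stands in for the warehouse (Snowflake, in this role's stack).
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 rows survive the amount >= 10.0 filter
```

In an Airflow migration, each of these functions would typically become its own task so that retries, scheduling, and monitoring apply per step rather than to the pipeline as a whole.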
