SynapOne

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a 2-year, 100% remote Data Engineer contract with an unspecified pay rate. It requires 3+ years in Data Engineering, strong Apache Airflow and Python skills, and experience with ETL migrations, particularly from Matillion to Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Deployment #ETL (Extract, Transform, Load) #SSIS (SQL Server Integration Services) #GIT #RDS (Amazon Relational Database Service) #Databases #AWS (Amazon Web Services) #Datasets #Migration #Data Engineering #Cloud #Docker #Informatica #Data Modeling #Data Warehouse #SQL (Structured Query Language) #Datadog #Apache Airflow #Python #Data Pipeline #Talend #Snowflake #Aurora #Monitoring #Matillion #Scala #IAM (Identity and Access Management) #Airflow #Kubernetes
Role description
Job Title: ETL Migration & Pipeline Development
Duration: 2-Year Contract
Location: 100% Remote
Client: SOMOS

We are seeking a Data Engineer contractor (100% remote) to support and execute the migration of ETL pipelines from Matillion to Apache Airflow. This role focuses on rebuilding pipelines, improving reliability, and enabling scalable, code-based data workflows. This is a hands-on role requiring someone who can ramp up quickly, work with existing ETL logic, and contribute within the first 1–2 weeks.

Responsibilities:
• Migrate existing ETL pipelines from Matillion to Apache Airflow while preserving logic and dependencies
• Develop and maintain Airflow DAGs with proper scheduling, retries, and failure handling (see the sketch after this listing)
• Reverse-engineer existing Matillion jobs and translate them into Python-based workflows
• Build and optimize data pipelines across systems such as S3, Snowflake, and relational databases
• Perform data validation and reconciliation between source and target systems
• Write and optimize SQL transformations for large-scale datasets
• Implement monitoring, alerting, and error handling for pipelines
• Collaborate with data, platform, and analytics teams to ensure smooth migration and deployment
• Document pipelines, workflows, and operational processes

Required Qualifications:
• 3+ years of experience in Data Engineering or ETL development
• Strong hands-on experience with Apache Airflow or similar orchestration tools
• Proficiency in Python for building data pipelines and workflows
• Strong SQL skills and experience working with large datasets
• Experience with cloud platforms (AWS preferred: S3, RDS/Aurora, IAM)
• Experience with cloud data warehouses (Snowflake preferred, or similar)
• Experience building and maintaining ETL/ELT pipelines
• Familiarity with Git and CI/CD workflows

Preferred Qualifications:
• Experience migrating ETL workflows from tools like Matillion, Informatica, Talend, or SSIS to Airflow
• Experience with Airflow in containerized environments (Docker, Kubernetes/EKS)
• Familiarity with data validation, reconciliation, and pipeline testing strategies
• Experience with monitoring tools such as Datadog or CloudWatch
• Understanding of data modeling or medallion architecture
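For a sense of what the DAG work involves, here is a minimal sketch of an Airflow DAG with the scheduling, retry, and failure-handling patterns the responsibilities describe. The DAG id, task callables, schedule, and alerting callback are illustrative assumptions, not the client's actual pipeline, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Minimal sketch: an Airflow DAG with explicit scheduling, retries,
# and a failure callback. Task bodies are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Hypothetical alerting hook; in practice this might forward to
    # Datadog or CloudWatch, both named in the preferred qualifications.
    print(f"Task {context['task_instance'].task_id} failed")


def extract_from_s3(**kwargs):
    # Placeholder for pulling source files from S3.
    ...


def load_to_snowflake(**kwargs):
    # Placeholder for loading transformed data into Snowflake.
    ...


with DAG(
    dag_id="matillion_migration_example",   # illustrative name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",                      # explicit schedule (Airflow 2.4+)
    catchup=False,
    default_args={
        "retries": 3,                       # retry transient failures
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,
    },
) as dag:
    extract = PythonOperator(task_id="extract_from_s3",
                             python_callable=extract_from_s3)
    load = PythonOperator(task_id="load_to_snowflake",
                          python_callable=load_to_snowflake)

    # Preserve the source-to-target dependency order from the Matillion job.
    extract >> load
```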