

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month remote contract, paying $70-$80 per hour. Requires 5+ years with Informatica Cloud, advanced SQL, Python, and 3+ years with Airflow and dbt. AWS experience is essential.
Country
United States
Currency
$ USD
Day rate
$640
Date discovered
September 24, 2025
Project duration
More than 6 months
Location type
Remote
Contract type
1099 Contractor
Security clearance
Unknown
Location detailed
United States
Skills detailed
#Programming #Airflow #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Informatica Cloud #Cloud #Amazon Redshift #Logging #Data Pipeline #Redshift #Python #Informatica #IAM (Identity and Access Management) #SQL (Structured Query Language) #JSON (JavaScript Object Notation) #Data Engineering #dbt (data build tool) #XML (eXtensible Markup Language) #Data Quality #S3 (Amazon Simple Storage Service) #IICS (Informatica Intelligent Cloud Services) #Metadata #Lambda (AWS Lambda) #Migration #AWS (Amazon Web Services) #Documentation
Role description
Job Title: Senior Data Engineer
Location: USA (Remote)
Pay Rate: $70-$80 per hour; 6-month contract
We are seeking experienced Data Engineers to join a fast-paced, data modernization initiative at a growing technology company. The team is migrating legacy pipelines from Informatica Cloud (IICS) to a modern ELT stack using dbt and Airflow on Amazon Redshift, with source data in S3. This is a hands-on role requiring both technical expertise and the ability to work independently after an initial ramp-up period.
What You'll Do:
• Migrate Informatica Cloud TaskFlows to dbt (SQL/Jinja) models and Airflow DAGs targeting Redshift (see the DAG sketch after this list).
• Reverse engineer Informatica logic by exporting JSON/XML TaskFlow metadata and translating transformations to dbt; leverage AI/LLM-assisted techniques to accelerate reverse engineering and documentation where appropriate (a metadata-inventory sketch follows below).
• Deliver to an expected weekly cadence, ramping from paired-programming onboarding to mostly independent migration of multiple TaskFlows per week.
• Harden pipelines with data quality checks, idempotent re-runs, logging, and clear rollback/repair procedures aligned to DataOps best practices (an idempotent-load sketch follows below).
• Document lineage, business rules, and model semantics to support downstream analytics and long-term maintainability.
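For context on the target stack, a minimal sketch of what one migrated TaskFlow can look like as an Airflow DAG that stages S3 data into Redshift and then builds the translated dbt models. All names here (DAG id, script path, dbt tag) are illustrative placeholders, not this project's actual objects:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG standing in for one migrated IICS TaskFlow.
with DAG(
    dag_id="orders_taskflow_migration",  # placeholder name
    start_date=datetime(2025, 9, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage raw S3 files into Redshift (e.g., a COPY wrapped in a helper script).
    stage = BashOperator(
        task_id="stage_s3_to_redshift",
        bash_command="python scripts/stage_orders.py",  # hypothetical helper
    )
    # Build and test the dbt models translated from the TaskFlow's mappings.
    build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --select tag:orders",
    )
    stage >> build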
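Reverse engineering typically starts from the exported TaskFlow artifacts. Below is a small Python inventory sketch; the real IICS export schema is richer than this, so the keys used here ("taskflow", "steps", "name", "type") are simplifying assumptions:

import json
from pathlib import Path

def list_steps(export_path: str) -> list[tuple[str, str]]:
    # Collect (name, type) pairs for each step in a simplified export,
    # as a starting inventory for planning the dbt translation.
    doc = json.loads(Path(export_path).read_text())
    return [
        (step.get("name", "?"), step.get("type", "?"))
        for step in doc.get("taskflow", {}).get("steps", [])
    ]

if __name__ == "__main__":
    for name, kind in list_steps("exports/orders_taskflow.json"):  # placeholder path
        print(f"{kind}: {name}")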
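On idempotent re-runs: within dbt this is usually handled by incremental models keyed on a unique id, but where a Python utility loads Redshift directly, a delete-then-insert inside one transaction gives the same property. A sketch using psycopg2, with placeholder schema, table, and column names:

import psycopg2

# Placeholder object names for illustration only.
DELETE_SQL = "DELETE FROM analytics.orders_daily WHERE load_date = %s;"
INSERT_SQL = """
    INSERT INTO analytics.orders_daily (order_id, order_total, load_date)
    SELECT order_id, order_total, %s
    FROM staging.orders
    WHERE order_date = %s;
"""

def load_partition(conn, load_date: str) -> None:
    # Delete-then-insert in one transaction: re-running the same date
    # replaces that day's rows instead of duplicating them.
    with conn, conn.cursor() as cur:
        cur.execute(DELETE_SQL, (load_date,))
        cur.execute(INSERT_SQL, (load_date, load_date))

if __name__ == "__main__":
    conn = psycopg2.connect("host=... dbname=... user=...")  # placeholder DSN
    load_partition(conn, "2025-09-24")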
Required Qualifications:
β’ 5+ years hands-on experience with Informatica Cloud (IICS), including exporting/interpreting JSON/XML artifacts.
β’ Advanced SQL (preferably Redshift dialect) and Python for ELT utilities and Airflow operators.
β’ 3+ years building and operating Airflow and dbt in production.
β’ Experience with AWS services such as S3, Redshift, IAM, and Lambda.
β’ Strong grasp of dimensional modeling and ability to translate complex business rules into warehouse models.
β’ Practical experience using AI/LLMs to accelerate reverse engineering and documentation (prompt crafting and validation).
Preferred Attributes:
β’ Ability to work full-time hours aligned with Central Time (U.S.).
β’ Comfortable operating independently in a high-velocity environment.
β’ Strong problem-solving skills and attention to detail in ETL and data pipeline design.
This is a remote U.S.-based opportunity with immediate start. Successful contractors who demonstrate high delivery velocity may have opportunities for longer-term engagement or conversion to full-time roles.