

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Mid & Senior) on a 2–6 month contract, remote (U.S. time zones). Requires 5+ years with IICS, strong SQL (Redshift), and 3+ years with Airflow/dbt. Senior candidates need 7–10+ years and advanced skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
2–6 months (extendable)
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Airflow #Informatica Cloud #Cloud #Amazon Redshift #Logging #AWS S3 (Amazon Simple Storage Service) #Redshift #Python #Informatica #Macros #IAM (Identity and Access Management) #SQL (Structured Query Language) #JSON (JavaScript Object Notation) #Data Engineering #dbt (data build tool) #XML (eXtensible Markup Language) #Data Quality #S3 (Amazon Simple Storage Service) #IICS (Informatica Intelligent Cloud Services) #GitHub #Lambda (AWS Lambda) #Migration #AWS (Amazon Web Services) #Documentation
Role description
Data Engineer Contractors (Mid & Senior) — Confidential Fintech
Remote (U.S. time zones, Central overlap) | Start: ASAP | Term: 2–6 months (extendable)
Openings: 4
What you’ll do
• Migrate Informatica Cloud (IICS) TaskFlows to dbt (SQL/Jinja) + Airflow on Amazon Redshift (sources in S3).
• Reverse-engineer IICS via JSON/XML exports; build reusable macros/patterns and code stubs.
• Ship fast: ramp Weeks 1–2, then ~4 TaskFlows/week.
• Add data quality checks, logging, idempotency, and clear rollback/repair; document lineage & rules.
• Use LLM-assisted techniques to speed reverse-engineering and documentation.
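The idempotency and rollback/repair bullet above can be sketched as a delete-then-insert load keyed on a batch identifier, so a re-run of the same batch repairs rather than duplicates. This is a generic pattern, shown here with SQLite for illustration; the table and column names are hypothetical, not from the posting:

```python
import sqlite3

def load_batch(conn, batch_id, rows):
    """Idempotent load: re-running the same batch replaces rows, never duplicates them."""
    cur = conn.cursor()
    # Rollback/repair step: removing the batch first makes a re-run safe.
    cur.execute("DELETE FROM orders WHERE batch_id = ?", (batch_id,))
    cur.executemany(
        "INSERT INTO orders (batch_id, order_id, amount) VALUES (?, ?, ?)",
        [(batch_id, r["order_id"], r["amount"]) for r in rows],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (batch_id TEXT, order_id INTEGER, amount REAL)")
rows = [{"order_id": 1, "amount": 9.99}, {"order_id": 2, "amount": 4.50}]
load_batch(conn, "2025-09-24", rows)
load_batch(conn, "2025-09-24", rows)  # re-run: same end state, no duplicates
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 2
```

In a Redshift/Airflow context the same idea is usually expressed as a transactional DELETE + COPY (or staging-table MERGE) inside one task, so Airflow retries stay safe.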
Must-haves (Mid-Level)
• 5+ yrs with IICS; strong SQL (Redshift preferred) & Python.
• 3+ yrs Airflow & dbt; AWS (S3, Redshift, IAM, Lambda).
• Dimensional modeling; comfort using LLMs for productivity.
Must-haves (Senior-Level)
• 7–10+ yrs data engineering with IICS and legacy-to-code migrations.
• Expert SQL (Redshift); advanced Airflow/dbt; AWS (S3, Redshift, IAM, CloudWatch).
• Proven Jinja macros, repeatable patterns, scaffolding; rigorous validation for LLM-assisted workflows.
• Strong warehouse architecture & dimensional modeling.
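The "Jinja macros, repeatable patterns" requirement typically means small dbt macros that standardize logic across migrated models. A minimal sketch (macro, source, and column names are illustrative, not from the posting):

```sql
-- macros/incremental_filter.sql (hypothetical dbt macro)
{% macro incremental_filter(ts_column) %}
  {% if is_incremental() %}
    where {{ ts_column }} > (select max({{ ts_column }}) from {{ this }})
  {% endif %}
{% endmacro %}

-- usage in a model:
select * from {{ source('s3_stage', 'orders') }}
{{ incremental_filter('updated_at') }}
```

A macro like this lets dozens of converted TaskFlows share one tested incremental pattern instead of hand-copied WHERE clauses.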
How to apply
Send your resume (and GitHub if relevant) plus 2–3 bullets on recent IICS → dbt/Airflow migrations, including throughput and patterns you built.