

Tential Solutions
ETL Developer (Informatica / Snowflake / Azure Data Factory)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer (Informatica / Snowflake / Azure Data Factory) on a long-term remote contract, offering competitive pay. Requires 4+ years of ETL experience, proficiency in Informatica, strong SQL and PL/SQL skills, and a Bachelor’s degree. U.S. citizenship or permanent residency (green card) is required; no sponsorship is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Azure Data Factory #Data Warehouse #Scala #Snowflake #Cloud #Data Modeling #Monitoring #Batch #Consulting #Data Integration #Dimensional Data Models #Data Architecture #SQL (Structured Query Language) #Oracle #ADF (Azure Data Factory) #IICS (Informatica Intelligent Cloud Services) #Data Pipeline #Azure #Unix #Informatica #ETL (Extract, Transform, Load) #Documentation
Role description
ETL Developer (Informatica / Snowflake / Azure)
Big 4 Consulting Group | Remote (U.S. Based) | Long-Term Contract
Role Summary
We are seeking an experienced ETL Developer to design, develop, and support enterprise data integration solutions. In this client-facing role, you will focus on building and maintaining robust data warehouse pipelines using the Informatica suite (IDMC / PowerCenter / IICS), implementing dimensional data models, and delivering reliable batch/CDC processing. You will work across the full SDLC to translate complex business requirements into scalable technical solutions using Snowflake and Azure Data Factory.
The Requirements
• Education: Bachelor’s Degree.
• Citizenship: Must be a U.S. Citizen or Green Card Holder (no sponsorship available).
• Travel: Ability to travel within the U.S. based on client needs.
Key Responsibilities
• Development: Build and maintain ETL solutions using Informatica IDMC, PowerCenter, and IICS.
• Data Modeling: Design and implement dimensional models, including FACT/DIM table design and Star/Snowflake schemas.
• Pipeline Orchestration: Design ETL mappings, workflows, and end-to-end scheduling logic using enterprise tools like Control-M.
• Cloud Integration: Build and support data pipelines using Snowflake and Azure Data Factory (ADF).
• Optimization: Write and tune complex SQL and PL/SQL for high-performance transformations and validations.
• Operational Excellence: Implement Change Data Capture (CDC) solutions (see the incremental-load sketch after this list) and provide production support for scheduled batch jobs.
• Collaboration: Partner with data architects and stakeholders to ensure pipelines meet performance SLAs and support downstream analytics.
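To give a flavor of the incremental/CDC work described above, here is a minimal sketch of a Snowflake MERGE that applies changed rows from a staging table into a fact table. The table and column names (STG_ORDERS, FACT_ORDERS, LOAD_TS) are hypothetical placeholders, not part of any client's actual data model.

```sql
-- Illustrative only: hypothetical STG_ORDERS staging table and FACT_ORDERS fact table.
-- Incremental (CDC-style) load: rows staged since the last successful run are
-- merged into the fact table, keyed on the business key ORDER_ID.
MERGE INTO FACT_ORDERS tgt
USING (
    SELECT ORDER_ID, CUSTOMER_KEY, PRODUCT_KEY, ORDER_DATE, AMOUNT, LOAD_TS
    FROM   STG_ORDERS
    WHERE  LOAD_TS > (SELECT COALESCE(MAX(LOAD_TS), '1900-01-01'::TIMESTAMP_NTZ)
                      FROM FACT_ORDERS)
) src
ON tgt.ORDER_ID = src.ORDER_ID
WHEN MATCHED THEN UPDATE SET
    tgt.CUSTOMER_KEY = src.CUSTOMER_KEY,
    tgt.PRODUCT_KEY  = src.PRODUCT_KEY,
    tgt.ORDER_DATE   = src.ORDER_DATE,
    tgt.AMOUNT       = src.AMOUNT,
    tgt.LOAD_TS      = src.LOAD_TS
WHEN NOT MATCHED THEN INSERT
    (ORDER_ID, CUSTOMER_KEY, PRODUCT_KEY, ORDER_DATE, AMOUNT, LOAD_TS)
VALUES
    (src.ORDER_ID, src.CUSTOMER_KEY, src.PRODUCT_KEY, src.ORDER_DATE, src.AMOUNT, src.LOAD_TS);
```

In practice, the Informatica mapping or ADF pipeline would stage the changed rows and run a statement like this as one step of the scheduled batch.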
Required Qualifications
• Experience: 4+ years of dedicated ETL development and maintenance for data warehousing.
• Informatica Expertise: Hands-on mastery of Informatica IDMC, PowerCenter, and/or IICS.
• Technical Skills: Strong proficiency in SQL and PL/SQL performance tuning, plus solid experience with UNIX and Oracle tools.
• Design: Deep understanding of ETL mapping design, workflow logic, and incremental load strategies.
• Delivery: Proven track record in production support, job monitoring, and issue resolution in batch environments.
Preferred / Nice-to-Have
• Snowflake: Strong knowledge of data loading patterns and performance considerations (illustrated in the sketch after this list).
• Azure: Experience building pipelines and orchestration within Azure Data Factory.
• CDC: Familiarity with log-based or incremental patterns for Change Data Capture.
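For the Snowflake loading patterns mentioned above, a common baseline is bulk-loading files from a stage with COPY INTO before downstream ETL transformations run. A minimal sketch, assuming a hypothetical stage (@RAW_STAGE) and staging table (STG_ORDERS):

```sql
-- Illustrative only: hypothetical stage @RAW_STAGE and staging table STG_ORDERS.
-- Bulk-load CSV files landed under the orders/ path into the staging table.
COPY INTO STG_ORDERS
FROM @RAW_STAGE/orders/
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
ON_ERROR = 'ABORT_STATEMENT';
```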
Success Measures
• Reliability: Stable, well-orchestrated workflows with minimal failures and fast recovery.
• Quality: High-quality data models that meet strict performance and analytics requirements.
• Efficiency: Improved pipeline performance through expert SQL tuning and transformation optimization.
• Clarity: Comprehensive documentation of mappings, workflows, and operational runbooks.






