Tential Solutions

ETL Developer (Informatica / Snowflake / Azure Data Factory)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Developer (Informatica / Snowflake / Azure Data Factory) on a long-term remote contract. Requires 4+ years of ETL experience, proficiency in Informatica and SQL, and a Bachelor's degree. U.S. citizenship or Green Card required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
April 5, 2026
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Remote
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Snowflake #Data Integration #Data Pipeline #Monitoring #Unix #IICS (Informatica Intelligent Cloud Services) #Cloud #Scala #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Batch #Azure #Documentation #Data Modeling #Informatica #Dimensional Data Models #Data Warehouse #Consulting #ADF (Azure Data Factory) #Oracle #Azure Data Factory #Data Architecture
Role description
ETL Developer (Informatica / Snowflake / Azure)
Big 4 Consulting Group
Remote (U.S. Based) | Long-Term Contract

Role Summary
We are seeking an experienced ETL Developer to design, develop, and support enterprise data integration solutions. In this client-facing role, you will focus on building and maintaining robust data warehouse pipelines using the Informatica suite (IDMC / PowerCenter / IICS), implementing dimensional data models, and delivering reliable batch/CDC processing. You will work across the full SDLC to translate complex business requirements into scalable technical solutions using Snowflake and Azure Data Factory.

Requirements
• Education: Bachelor's Degree.
• Citizenship: Must be a U.S. Citizen or Green Card holder (no sponsorship available).
• Travel: Ability to travel within the U.S. based on client needs.

Key Responsibilities
• Development: Build and maintain ETL solutions using Informatica IDMC, PowerCenter, and IICS.
• Data Modeling: Design and implement dimensional models, including FACT/DIM table design and Star/Snowflake schemas.
• Pipeline Orchestration: Design ETL mappings, workflows, and end-to-end scheduling logic using enterprise tools like Control-M.
• Cloud Integration: Build and support data pipelines using Snowflake and Azure Data Factory (ADF).
• Optimization: Write and tune complex SQL and PL/SQL for high-performance transformations and validations.
• Operational Excellence: Implement Change Data Capture (CDC) solutions and provide production support for scheduled batch jobs.
• Collaboration: Partner with data architects and stakeholders to ensure pipelines meet performance SLAs and support downstream analytics.

Required Qualifications
• Experience: 4+ years of dedicated ETL development and maintenance for data warehousing.
• Informatica Expertise: Hands-on mastery of Informatica IDMC, PowerCenter, and/or IICS.
• Technical Skills: Strong proficiency in SQL and PL/SQL performance tuning, plus solid experience with UNIX and Oracle tools.
• Design: Deep understanding of ETL mapping design, workflow logic, and incremental load strategies.
• Delivery: Proven track record in production support, job monitoring, and issue resolution in batch environments.

Preferred / Nice-to-Have
• Snowflake: Strong knowledge of data loading patterns and performance considerations.
• Azure: Experience building pipelines and orchestration within Azure Data Factory.
• CDC: Familiarity with log-based or incremental patterns for Change Data Capture.

Success Measures
• Reliability: Stable, well-orchestrated workflows with minimal failures and fast recovery.
• Quality: High-quality data models that meet strict performance and analytics requirements.
• Efficiency: Improved pipeline performance through expert SQL tuning and transformation optimization.
• Clarity: Comprehensive documentation of mappings, workflows, and operational runbooks.

#Remote