

Techvy Corp
ETL Engineer (Data Migration & Validation)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Engineer (Data Migration & Validation) based in Phoenix, AZ or Rockville, MD, with a hybrid work model. The contract length is unspecified and the pay rate is competitive. Key skills include SQL, Python, and Snowflake, with 4–8 years of ETL experience required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 12, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Phoenix, AZ / Rockville, MD
-
🧠 - Skills detailed
#Azure Data Factory #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Batch #ADF (Azure Data Factory) #Airflow #Snowflake #Debugging #Data Profiling #Data Migration #Migration #Azure #Data Quality #Data Pipeline #Python #Automation #Informatica #Spark (Apache Spark) #Documentation
Role description
ETL Engineer (Data Migration & Validation)
Location: Phoenix, AZ / Rockville, MD
🏢 Mode: Hybrid – Day 1 onsite (2–3 days per week)
We are looking for a hands-on ETL Engineer with deep expertise in data migration, validation, and transformation to support the large-scale migration of legacy warehouse systems to Snowflake.
✅ Certifications in Snowflake or Azure are a big plus — please include certification links if applicable.
✅ Preference for candidates recently laid off from top-tier companies (Meta, Google, Apple, Microsoft, etc.).
🔧 Key Responsibilities:
• Design and build ETL/ELT pipelines to migrate data from Fiserv, CardWorks, PCFS, HR, and Finance systems.
• Perform data profiling, validation, and reconciliation between source and Snowflake targets (see the reconciliation sketch after this list).
• Build ingestion frameworks (batch, incremental, CDC) using Azure Data Factory or Airflow (see the ingestion sketch after this list).
• Support historical loads and incremental refreshes.
• Implement data quality checks, audit logs, and error handling.
• Collaborate on Bronze–Silver–Gold data layer architecture.
• Optimize pipelines for performance and cost.
• Maintain detailed pipeline documentation and validation scripts.
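As a rough illustration of the ingestion side of the role, below is a minimal Airflow DAG sketch for a daily incremental load into Snowflake, assuming the apache-airflow-providers-snowflake package's SnowflakeOperator. The connection id (snowflake_default), stage, database/schema/table names, and merge keys are hypothetical placeholders; a production pipeline would add the data quality checks, audit logging, and error handling listed above.
```python
# Minimal Airflow DAG sketch: daily incremental load into Snowflake.
# Connection id, stage, table names, and merge keys are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="legacy_incremental_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["migration", "snowflake"],
) as dag:

    # Land the day's extract from an external stage into a Bronze (raw) table.
    copy_into_bronze = SnowflakeOperator(
        task_id="copy_into_bronze",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO BRONZE.LEGACY.TRANSACTIONS
            FROM @BRONZE.LEGACY.LANDING_STAGE/transactions/{{ ds }}/
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
        """,
    )

    # Merge the new rows into the Silver (conformed) layer incrementally.
    merge_into_silver = SnowflakeOperator(
        task_id="merge_into_silver",
        snowflake_conn_id="snowflake_default",
        sql="""
            MERGE INTO SILVER.LEGACY.TRANSACTIONS AS tgt
            USING (
                SELECT * FROM BRONZE.LEGACY.TRANSACTIONS
                WHERE load_date = '{{ ds }}'
            ) AS src
            ON tgt.transaction_id = src.transaction_id
            WHEN MATCHED THEN UPDATE SET
                tgt.amount = src.amount,
                tgt.updated_at = src.updated_at
            WHEN NOT MATCHED THEN INSERT (transaction_id, amount, updated_at)
                VALUES (src.transaction_id, src.amount, src.updated_at);
        """,
    )

    copy_into_bronze >> merge_into_silver
```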
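And as a rough illustration of the validation side, below is a minimal Python sketch that reconciles row counts and a simple SUM checksum between a legacy source table and its Snowflake target. The ODBC DSN, Snowflake connection parameters, table names, and checksum column are hypothetical placeholders; real reconciliation against systems such as Fiserv or CardWorks would use agreed control totals and tolerances.
```python
# Minimal reconciliation sketch: compare row counts and a numeric checksum
# between a source table and its Snowflake target. All names and credentials
# below are hypothetical placeholders.
import pyodbc
import snowflake.connector

SOURCE_DSN = "DSN=legacy_warehouse"  # hypothetical ODBC DSN for the legacy source
SNOWFLAKE_PARAMS = dict(account="xxx", user="xxx", password="xxx",
                        warehouse="MIGRATION_WH", database="SILVER", schema="LEGACY")

CHECKS = {
    # source table -> (Snowflake target table, numeric column for a SUM checksum)
    "dbo.loan_balances": ("LOAN_BALANCES", "BALANCE_AMT"),
}

def fetch_metrics(cursor, table: str, amount_col: str):
    """Return (row_count, checksum) for one table."""
    cursor.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
    count, checksum = cursor.fetchone()
    return int(count), float(checksum)

def reconcile() -> list[str]:
    failures = []
    src = pyodbc.connect(SOURCE_DSN).cursor()
    snow = snowflake.connector.connect(**SNOWFLAKE_PARAMS).cursor()
    for src_table, (tgt_table, amount_col) in CHECKS.items():
        src_count, src_sum = fetch_metrics(src, src_table, amount_col)
        tgt_count, tgt_sum = fetch_metrics(snow, tgt_table, amount_col)
        if src_count != tgt_count:
            failures.append(f"{src_table}: row count {src_count} vs {tgt_count}")
        if abs(src_sum - tgt_sum) > 0.01:  # tolerance for decimal/float drift
            failures.append(f"{src_table}: checksum {src_sum} vs {tgt_sum}")
    return failures

if __name__ == "__main__":
    problems = reconcile()
    if problems:
        raise SystemExit("Reconciliation failed:\n" + "\n".join(problems))
    print("All reconciliation checks passed.")
```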
💼 Required Skills & Experience:
• 4–8 years in ETL development (SQL, Python, Spark, Azure Data Factory, or Informatica).
• Strong background in data migration and validation to Snowflake (or similar).
• Hands-on SQL debugging and performance tuning expertise.
• Familiarity with CI/CD automation for data pipelines is a plus.
• Excellent analytical and teamwork skills.





