

Relanto
Lead Data Engineer (Informatica, DBT)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer (Informatica, DBT) on a contract basis, requiring expertise in Informatica, dbt, Snowflake, and Apache Airflow. The position involves data pipeline development and optimization, with a focus on data governance and compliance. Pay rate and location are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Version Control #GIT #Migration #Security #Data Pipeline #Data Architecture #Batch #Scala #Data Management #Informatica Cloud #ETL (Extract, Transform, Load) #dbt (data build tool) #Compliance #Data Governance #Cloud #Metadata #Apache Airflow #Informatica #Snowflake #BI (Business Intelligence) #Data Warehouse #SQL (Structured Query Language) #Data Quality #Datasets #Informatica PowerCenter #Data Catalog #Airflow #Data Engineering
Role description
Job Description:
Key Responsibilities
• Migrate existing Informatica workflows and SQL transformation logic into scalable and optimized dbt models.
• Design, develop, and maintain data pipelines using dbt, Snowflake, SQL, and Apache Airflow.
• Build and enhance enterprise data products to support analytics, reporting, and business intelligence requirements.
• Develop reusable, modular, and well-documented transformation frameworks in dbt.
• Perform end-to-end testing, validation, and reconciliation of migrated data pipelines and data products.
• Implement data governance standards, metadata management, and governance tagging across datasets and pipelines.
• Optimize Snowflake data models, queries, and warehouse performance for scalability and efficiency.
• Collaborate with cross-functional teams including Data Architects, Analysts, QA teams, and Business Stakeholders.
• Monitor and troubleshoot Airflow workflows and ensure reliable orchestration of batch data processes.
• Ensure adherence to data quality, security, compliance, and engineering best practices.
Required Skills & Qualifications
• Strong hands-on experience with Informatica PowerCenter or Informatica Cloud.
• Proven experience in dbt development and migration projects.
• Strong expertise in Snowflake data warehouse concepts and performance optimization.
• Advanced SQL development and query optimization skills.
• Experience working with Apache Airflow for workflow orchestration and scheduling.
• Strong understanding of ETL/ELT concepts, data warehousing, and dimensional modeling.
• Experience with data testing, validation, and reconciliation methodologies.
• Knowledge of data governance, metadata tagging, and data cataloging practices.
• Familiarity with CI/CD processes and version control tools such as Git.
• Excellent analytical, problem-solving, and communication skills.
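For candidates less familiar with dbt, the migration target for an Informatica mapping is typically a single modular SQL model. A minimal illustrative sketch follows; the model, source, and column names are hypothetical and not taken from this posting:

```sql
-- models/staging/stg_orders.sql  (hypothetical dbt model)
-- One Informatica mapping becomes one documented, version-controlled SELECT
-- that dbt materializes and tracks in its lineage graph.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    cast(order_ts as date)  as order_date,    -- type conversion formerly in an Expression transformation
    upper(trim(status))     as order_status   -- cleansing logic migrated from the mapping
from {{ source('raw', 'orders') }}            -- replaces a hard-coded Informatica connection
where order_id is not null                    -- Filter transformation equivalent
```

Using `{{ source() }}` and `{{ ref() }}` rather than hard-coded table names is what lets dbt generate lineage and documentation automatically, which is central to making migrated pipelines reusable and testable.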