

ESB Technologies
Data Transformation Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Transformation Engineer in Dallas-Fort Worth, offered on a W2 or 1099 contract basis. Requires 10+ years in data engineering; expertise in Snowflake, SQL, Python, and ETL tools; plus strong delivery and risk management skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 31, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas-Fort Worth Metroplex
-
🧠 - Skills detailed
#Data Pipeline #Consulting #dbt (data build tool) #ETL (Extract, Transform, Load) #Python #Data Modeling #SQL (Structured Query Language) #Leadership #Data Engineering #PySpark #Snowflake #Informatica #Airflow #ADF (Azure Data Factory) #Spark (Apache Spark)
Role description
Job: Data Transformation Engineer
Location: Dallas-Fort Worth
W2 / 1099 Only
Job Description:
Qualifications
• 10+ years of experience in data engineering and delivery leadership.
• Proven track record managing delivery of Snowflake-based data platforms (pipelines, ETL, semantic layers).
• Hands-on experience with:
• SQL, Python, PySpark
• Snowflake (data modeling, performance optimization, RBAC, Bronze/Silver/Gold architecture)
• dbt for transformations/configurations
• ETL/ELT tools (Airflow, Informatica, Glue, ADF, etc.)
• Strong background in data pipeline testing, validation, and quality frameworks.
• Experience working with consulting partners and distributed onshore/offshore engineering teams.
• Excellent delivery and risk management skills, able to foresee challenges, maintain RAID logs, and keep workstreams accountable.
• Strong executive presence; comfortable reporting to AVPs, the CIO, and executive steering committees.






