CloudIngest

Azure Data Factory Engineer (W2 Only)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Factory Engineer with an unknown contract length and a pay rate of $45/hr (W2). Key skills include Azure Data Factory, PySpark, Databricks, Snowflake, and advanced data modeling. A Bachelor’s degree and 5+ years of data engineering experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
360
🗓️ - Date
January 7, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Houston, TX
🧠 - Skills detailed
#Data Cleaning #Cloud #Snowflake #Computer Science #Compliance #Data Science #PySpark #AI (Artificial Intelligence) #ADF (Azure Data Factory) #Schema Design #Monitoring #SQL (Structured Query Language) #Storage #Data Processing #ETL (Extract, Transform, Load) #Azure Data Factory #Scala #ML (Machine Learning) #Spark (Apache Spark) #Data Modeling #Databricks #Data Governance #Azure SQL #Data Ingestion #Security #Data Engineering #TensorFlow #Azure #Datasets #Data Pipeline #Big Data #Data Warehouse
Role description
Please send relevant profiles to srikanth@cloudingest.com (W2 only).

Job Role/Title: Data Engineer (ADF)
Location: Irving, TX / Houston, TX / Atlanta, GA / Chicago, IL
Rate Cap: $45/hr (W2) max

Role Overview:
We are seeking a highly skilled Data Engineer with deep expertise in modern data platforms and advanced analytics. The ideal candidate will design, build, and optimize data pipelines, ensuring scalable, reliable, and high-performance data solutions across cloud and big data ecosystems. This role requires strong hands-on experience with Azure Data Factory, PySpark, Databricks, Snowflake, and advanced data modeling techniques.

Key Responsibilities

Data Ingestion & Integration
• Design and implement robust data ingestion pipelines using Azure Data Factory (ADF).
• Apply ADF best practices for scalability, monitoring, and error handling.

Data Processing & Optimization
• Develop and optimize PySpark jobs, focusing on efficient joins, unions, and skew mitigation.
• Identify and resolve data skew issues to improve performance in distributed environments.
• Implement schema drift handling strategies to ensure pipeline resilience.

Data Modeling & Storage
• Build and maintain star schema models for analytical workloads.
• Work with Azure SQL and Snowflake to design scalable data warehouses.
• Manage data cleaning and transformation processes for high-quality datasets.

Advanced Analytics & AI Integration
• Leverage TensorFlow and contextual/model embeddings to integrate machine learning models into data pipelines.
• Implement bias detection and transformation techniques to ensure fairness in data-driven models.
• Identify data clusters in Databricks without impacting performance, enabling advanced analytics.

Collaboration & Governance
• Partner with data scientists, analysts, and business stakeholders to deliver actionable insights.
• Ensure compliance with data governance, security, and privacy standards.
• Document processes, pipelines, and best practices for knowledge sharing.

Required Skills & Expertise
• Proven experience with Azure Data Factory (ADF), including ingestion pipelines and best practices.
• Strong proficiency in Azure SQL, PySpark, and Databricks.
• Expertise in Snowflake data warehousing and star schema design.
• Hands-on experience with TensorFlow for embedding and model integration.
• Deep understanding of data skew issues and optimization strategies.
• Knowledge of schema drift handling and bias transformation techniques.
• Strong background in data cleaning, contextual embedding, model embedding, and advanced analytics integration.

Qualifications
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
• 5+ years of experience in data engineering roles with cloud and big data platforms.
• Demonstrated ability to optimize large-scale distributed data systems.
• Excellent problem-solving and collaboration skills.
• Excellent written and verbal communication skills.
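For candidates preparing for this role: the "skew mitigation" responsibility commonly refers to key salting, where hot keys on the large side of a join are tagged with a random salt and the small side is replicated across all salt values. A minimal pure-Python sketch of the idea (a real implementation would operate on PySpark DataFrames; the function name and shapes here are illustrative, not from the posting):

```python
import random
from collections import defaultdict

def salted_join(big, small, n_salts=4):
    """Join two lists of (key, value) pairs using key salting.

    Salting spreads rows sharing a hot key across n_salts buckets, which in a
    distributed engine spreads them across partitions instead of one executor.
    """
    # Tag each row on the large side with a random salt in [0, n_salts).
    salted_big = [((k, random.randrange(n_salts)), v) for k, v in big]
    # Replicate the small side across every salt so each salted key matches.
    salted_small = [((k, s), v) for k, v in small for s in range(n_salts)]
    # Build a lookup on the (key, salt) composite and join.
    lookup = defaultdict(list)
    for (k, s), v in salted_small:
        lookup[(k, s)].append(v)
    return [(k, bv, sv) for (k, s), bv in salted_big for sv in lookup[(k, s)]]
```

The output is identical to an unsalted inner join; only the key distribution changes, which is why the technique is safe to apply to skewed workloads.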
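Similarly, "schema drift handling" usually means tolerating source records whose field sets change over time. One common approach, sketched here in plain Python as an illustration (the posting's pipelines would do this in ADF mapping data flows or PySpark), is to normalize every record to the union of all observed fields, filling gaps with nulls:

```python
def normalize_records(records):
    """Normalize dict records with drifting schemas to a common field set.

    Missing fields are filled with None so downstream consumers see a
    stable, predictable schema even as upstream sources add columns.
    """
    # Union of every field name seen across all records.
    fields = sorted({f for rec in records for f in rec})
    # Re-emit each record with the full field set.
    return [{f: rec.get(f) for f in fields} for rec in records]
```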