Saxon Global

Data Engineer (Azure / Databricks)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Azure Data Engineer in Dallas, TX (Hybrid). The 12+ month contract requires 10+ years of data engineering experience, 3+ years with Azure, and expertise in ADF, Databricks, SQL, and Python. Preferred certifications include Azure Data Engineer Associate and Databricks Certified Data Engineer.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Data Pipeline #Python #Documentation #Data Warehouse #Databricks #Data Engineering #Version Control #ADF (Azure Data Factory) #Azure Data Factory #ETL (Extract, Transform, Load) #Azure Databricks #ADLS (Azure Data Lake Storage) #Data Transformations #GIT #Cloud #Azure cloud #Data Lake #Spark (Apache Spark) #Storage #Azure #Delta Lake #Scala #Azure ADLS (Azure Data Lake Storage) #PySpark
Role description
Job Title: Senior Azure Data Engineer
Location: Dallas, TX (Hybrid – 3 days onsite per week)
Duration: 12+ Months Contract

Core Technical Skills
• 10+ years of overall Data Engineering experience
• 3+ years of hands-on experience with Azure Cloud
• Strong experience with:
   • Azure Data Factory (ADF) – pipeline development & orchestration
   • Azure Databricks – Spark, Delta Lake, data transformations
   • SQL – advanced querying, optimization
   • Python / PySpark
   • Azure Data Lake Storage (ADLS)
• Experience with:
   • CI/CD pipelines
   • Git version control
   • ETL / ELT processes
   • Data warehousing concepts

Certifications (Preferred, Not Mandatory)
• Microsoft Certified: Azure Data Engineer Associate
• Databricks Certified Data Engineer (Associate or Professional)

Key Responsibilities
• Design, build, and maintain scalable Azure data pipelines
• Develop and manage ETL/ELT processes using ADF and Databricks
• Integrate data from multiple enterprise systems
• Optimize data performance and storage
• Build and support data warehouses for analytics and reporting
• Monitor, troubleshoot, and resolve data pipeline issues
• Maintain clear technical documentation
• Stay updated with Azure data engineering best practices
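
For candidates gauging fit, the sketch below illustrates the kind of Databricks/PySpark transformation work the responsibilities describe (reading raw data from ADLS, cleansing it, and writing a Delta table for analytics). It is a minimal, hypothetical example: the storage path, table name, and column names are assumptions, not details from this posting.

```python
# Minimal PySpark sketch of a Databricks-style curation step.
# All names (path, columns, curated.orders) are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

# Read raw data landed in ADLS as Delta (hypothetical container/path)
raw = spark.read.format("delta").load(
    "abfss://raw@examplelake.dfs.core.windows.net/orders"
)

# Basic cleansing and transformation: deduplicate, derive a date, filter bad rows
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_timestamp"))
       .filter(F.col("amount") > 0)
)

# Write the result back as a Delta table for downstream reporting
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```

In practice this logic would typically sit in a Databricks notebook or job orchestrated by an ADF pipeline, with Git-backed version control and CI/CD promotion between environments, matching the skills listed above.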