

TEK NINJAS
Azure Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Azure Data Engineer on a 12-month hybrid contract (2-3 days onsite per week), offering competitive pay. It requires 10+ years of data engineering experience, including 5+ years on Azure, with expertise in ADF, Databricks, and financial or retail data systems.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 13, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Texas, United States
-
🧠 - Skills detailed
#Data Governance #Compliance #Sales Forecasting #ADLS (Azure Data Lake Storage) #GIT #Data Engineering #PySpark #PCI (Payment Card Industry) #Data Modeling #GDPR (General Data Protection Regulation) #Azure Data Factory #Scala #Forecasting #Synapse #Delta Lake #Data Transformations #Data Lake #Data Lakehouse #Data Ingestion #Spark SQL #Snowflake #ADF (Azure Data Factory) #Microsoft Power BI #Data Integrity #SQL (Structured Query Language) #Azure #DevOps #ETL (Extract, Transform, Load) #Python #Datasets #Azure DevOps #Data Pipeline #Security #Spark (Apache Spark) #Monitoring #BI (Business Intelligence) #Kafka (Apache Kafka) #Azure Databricks #Databricks
Role description
Role: Senior Azure Data Engineer
Location: Hybrid – Onsite 2–3 days/week
Type: 12-Month Contract / Full-Time
Experience: 10+ Years Total | 5+ in Azure Data Engineering
Key Responsibilities
• Design, develop, and optimize data ingestion, transformation, and integration pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
• Build data lakehouse architectures leveraging Azure Data Lake (ADLS Gen2) and Delta Lake for large-scale banking, financial, and retail data systems.
• Develop and maintain ETL/ELT pipelines in PySpark, SQL, and Python, ensuring performance, scalability, and data integrity (a minimal illustrative sketch follows this list).
• Implement data models and pipelines to support risk analytics, transaction monitoring, sales forecasting, and customer insights.
• Collaborate with business teams and BI developers to deliver data solutions for Power BI and reporting dashboards.
• Ensure data governance, lineage tracking, and security compliance (GDPR, PCI-DSS, SOX).
• Deploy and automate workflows using Azure DevOps, Git, and CI/CD pipelines.
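To give a concrete sense of the pipeline work described above, here is a minimal, illustrative PySpark sketch of one ingestion step: landing raw files from ADLS Gen2 into a Delta Lake table. All storage paths, container names, and column names are hypothetical placeholders, and the sketch assumes a Spark environment with Delta Lake available (e.g. Azure Databricks); it is not part of the role description itself.

    # Illustrative only: raw CSV files from ADLS Gen2 cleansed and appended
    # to a Delta table. Paths, columns, and the app name are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("illustrative-ingest")  # hypothetical app name
        .getOrCreate()
    )

    # Hypothetical ADLS Gen2 locations; a real pipeline would parameterize
    # these (e.g. via ADF pipeline parameters or Databricks job widgets)
    # and rely on cluster-level credentials for storage access.
    raw_path = "abfss://raw@examplelake.dfs.core.windows.net/transactions/"
    delta_path = "abfss://curated@examplelake.dfs.core.windows.net/transactions_delta/"

    raw_df = spark.read.option("header", "true").csv(raw_path)

    # Basic cleansing: cast the amount, drop rows without a transaction id,
    # and stamp the load time to support lineage tracking and monitoring.
    curated_df = (
        raw_df
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("transaction_id").isNotNull())
        .withColumn("load_ts", F.current_timestamp())
    )

    # Append into a Delta table partitioned by a (hypothetical) business date.
    (
        curated_df.write
        .format("delta")
        .mode("append")
        .partitionBy("transaction_date")
        .save(delta_path)
    )

In practice, a step like this would typically be orchestrated by an Azure Data Factory pipeline or a Databricks job and promoted through environments with Azure DevOps CI/CD, in line with the responsibilities above.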
Required Skills
• 10+ years of experience in data engineering, including 5+ years on Azure.
• Strong expertise in Azure Data Factory (ADF), Azure Databricks, Synapse Analytics, and ADLS Gen2.
• Proficiency in SQL, Python, and PySpark for large-scale data transformations.
• Deep understanding of data modeling, warehousing, and lakehouse patterns.
• Experience with banking and financial datasets (transactions, risk, compliance) or retail data pipelines (sales, POS, customer analytics).
• Familiarity with Azure Monitor, Log Analytics, and data governance frameworks.
Preferred Skills
• Experience with Power BI, Snowflake, or Data Mesh architectures.
• Understanding of real-time data streaming (Kafka, Event Hub).
• Certification: Microsoft DP-203 (Azure Data Engineer Associate) preferred.