

Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Contract, 6-12 months) focused on Microsoft Fabric and Azure Cloud. Remote work, pay rate unspecified. Requires experience in Azure Data Services, Databricks, Python, SQL, and Power BI. Microsoft certifications preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 2, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Fixed Term
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#BI (Business Intelligence) #Consulting #Infrastructure as Code (IaC) #SQL (Structured Query Language) #Data Modeling #Azure #Data Lake #Delta Lake #Data Pipeline #Terraform #Automation #DevOps #Kafka (Apache Kafka) #Python #Scala #ETL (Extract, Transform, Load) #Cloud #ADF (Azure Data Factory) #Data Engineering #Data Lineage #Microsoft Power BI #ML (Machine Learning) #Data Quality #Deployment #Programming #Synapse #Spark (Apache Spark) #Databricks #Azure Cloud
Role description
Senior Data Engineer (Contract) – Microsoft Fabric & Azure Cloud
Location: Remote
Type: Contract (6-12 months, extension possible)
About the Opportunity
Are you passionate about building robust data platforms with the latest cloud technologies? We’re seeking a Senior Data Engineer to help us accelerate our data transformation journey using Microsoft Fabric, Azure Data Services, Databricks, and Power BI. This is a high-impact, hands-on role, ideal for consultants who thrive in dynamic environments and enjoy delivering enterprise-grade solutions.
Key Responsibilities
• Design & Build Scalable Data Pipelines: Architect and implement end-to-end data pipelines leveraging Azure Data Factory, Synapse Analytics, Databricks, and Microsoft Fabric, following a lakehouse/medallion architecture.
• Modern Data Modeling & Lakehouse Management: Develop and optimize data models, Delta Lake structures, and curated datasets to power analytics and reporting solutions.
• Enable Advanced Analytics & BI: Partner with analysts and business users to deliver high-value insights using Power BI and self-service analytics frameworks.
• Data Quality & Governance: Implement data quality checks, data lineage, and governance standards to ensure trusted data delivery.
• Performance & Cost Optimization: Fine-tune Spark/Databricks jobs and Azure resources for efficiency, scalability, and cost-effectiveness.
• Automation & DevOps: Build CI/CD workflows and automate data deployments using IaC tools (ARM templates, Terraform, etc.).
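To give a flavor of the data quality work described above, here is a minimal, hedged sketch of rule-based record validation in plain Python. In practice these rules would run inside a Spark/Databricks or Fabric pipeline; all function and field names below are illustrative, not taken from the posting.

```python
def check_row(row, required_fields, allowed_status):
    """Return a list of quality violations for one record.

    Hypothetical rules: required fields must be non-empty,
    and 'status' must come from an allowed set.
    """
    issues = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            issues.append(f"missing {field}")
    if row.get("status") not in allowed_status:
        issues.append(f"invalid status: {row.get('status')!r}")
    return issues


def partition_by_quality(rows, required_fields, allowed_status):
    """Split records into clean rows and quarantined rows with reasons,
    mirroring the common pattern of routing bad records to a
    quarantine table instead of failing the whole load."""
    clean, quarantined = [], []
    for row in rows:
        issues = check_row(row, required_fields, allowed_status)
        (clean if not issues else quarantined).append((row, issues))
    return clean, quarantined
```

The same check-and-quarantine pattern scales to Spark by expressing each rule as a column predicate and filtering the DataFrame into clean and quarantined subsets.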
What You Bring
• Proven experience in building enterprise data solutions on Azure (ADF, Synapse, Data Lake, SQL) and Databricks.
• Strong command of Microsoft Fabric for complex pipeline development and orchestration.
• Advanced programming in Python and SQL; experience with Spark is a plus.
• Track record in implementing lakehouse/medallion architectures, ETL/ELT frameworks, and scalable data models.
• Ability to deliver business-ready dashboards and analytics with Power BI.
• Strong problem-solving skills and a collaborative approach in cross-functional teams.
• Previous contract/consulting experience is a strong advantage.
Nice to Have
• Experience with real-time streaming (Kafka, Event Hubs, Spark Streaming).
• Knowledge of MLOps or integration with machine learning workflows.
• Microsoft certifications (e.g., Azure Data Engineer, Fabric Data Engineer).
Why Work With Us?
• Challenging Projects: Be part of a next-generation data modernization initiative.
• Remote Flexibility: 100% remote, outcome-driven environment.
• Tech-Forward Stack: Work with the newest features of Microsoft Fabric, Azure, and Databricks.
• Impact: Directly shape business outcomes and modern data practices.
Ready to take your data engineering expertise to the next level?
Send us your resume and a brief note about your recent project experience. We look forward to collaborating!