

Data Engineer - Azure / DataBricks
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an SC Cleared Azure Data Engineer specializing in Databricks, offering £500-550/day for a 3-month contract (hybrid, London). Key skills include Azure, Databricks, Python, SQL, and experience with data pipelines and CI/CD practices.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
550
-
🗓️ - Date discovered
August 28, 2025
🕒 - Project duration
3 to 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Outside IR35
-
🔒 - Security clearance
Yes
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#DevOps #Delta Lake #AI (Artificial Intelligence) #Azure DevOps #Azure Databricks #Azure Data Factory #ADF (Azure Data Factory) #Azure #Data Lake #ML (Machine Learning) #SQL (Structured Query Language) #Synapse #Databricks #Data Engineering #Azure Synapse Analytics #Data Science #Data Pipeline #Data Processing #Deployment #Python
Role description
This is an exciting contract opportunity for an SC Cleared Azure Data Engineer with a strong focus on Databricks to join an experienced team in a new customer engagement working at the forefront of data analytics and AI. This role offers the chance to take a key role in the design and delivery of advanced Databricks solutions within the Azure ecosystem.
Responsibilities:
• Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables.
• Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions.
• Drive best practices for data engineering.
• Help clients realise the potential of data science, machine learning, and scaled data processing within the Azure/Databricks ecosystem.
• Mentor junior engineers and support their personal development.
• Take ownership of the delivery of core solution components.
• Support planning, requirements refinement, and work estimation.
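The layered pipeline work described above is often organised in the medallion (bronze/silver/gold) pattern. The sketch below shows that shape in plain Python so it runs anywhere; in Databricks the same structure would typically be expressed with Delta Live Tables, and the record fields and cleaning rules here are illustrative assumptions, not part of the role.

```python
# Minimal medallion-style pipeline sketch: bronze (raw landing),
# silver (cleaned), gold (aggregated). Field names are hypothetical.

def bronze_ingest(raw_rows):
    """Land raw records as-is, tagging each with a source marker."""
    return [{**row, "_source": "landing"} for row in raw_rows]

def silver_clean(bronze_rows):
    """Drop records missing a customer id and normalise amounts to float."""
    return [
        {**row, "amount": float(row["amount"])}
        for row in bronze_rows
        if row.get("customer_id") is not None
    ]

def gold_aggregate(silver_rows):
    """Total spend per customer: the curated, analytics-ready layer."""
    totals = {}
    for row in silver_rows:
        cid = row["customer_id"]
        totals[cid] = totals.get(cid, 0.0) + row["amount"]
    return totals

raw = [
    {"customer_id": "c1", "amount": "10.0"},
    {"customer_id": None, "amount": "5.0"},   # rejected at the silver layer
    {"customer_id": "c1", "amount": "2.5"},
]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(gold)  # {'c1': 12.5}
```

Each layer is a pure function over the previous layer's output, which keeps the stages independently testable — the same property Delta Live Tables encourages with its table-per-function decorators.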
Skills & Experience:
• Proven experience designing and implementing data solutions in Azure using Databricks as a core platform.
• Hands-on expertise in Delta Lake, Delta Live Tables, and Databricks Workflows.
• Strong coding skills in Python and SQL, with experience in developing modular, reusable code in Databricks.
• Deep understanding of lakehouse architecture, with a solid grasp of data warehousing, data lakes, and real-time data processing.
• Familiarity with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric.
• Good experience with CI/CD practices and tools for data platforms using Azure DevOps.
• Good knowledge of how to leverage AI to increase deployment productivity and quality.
• Excellent communication skills.
• Desirable: Databricks Data Engineer and/or Azure Data Engineer certification.
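The CI/CD bullet above usually means gating dataset promotion on automated data-quality checks. The sketch below is a hedged, framework-free illustration of such a gate in Python — the kind of check an Azure DevOps stage might run; the rule name, predicate, and threshold are illustrative assumptions.

```python
# Row-level data-quality expectation of the kind a CI/CD stage could
# evaluate before promoting a dataset. All values here are hypothetical.

def expect(rows, predicate, max_fail_ratio=0.0):
    """Return (passed, fail_ratio) for a row-level expectation.

    passed is True when the share of rows failing the predicate
    does not exceed max_fail_ratio.
    """
    if not rows:
        return True, 0.0
    failures = sum(1 for r in rows if not predicate(r))
    ratio = failures / len(rows)
    return ratio <= max_fail_ratio, ratio

rows = [{"amount": 10.0}, {"amount": -1.0}, {"amount": 3.0}]

# One of three rows fails the non-negative check; tolerate up to 50%.
passed, ratio = expect(rows, lambda r: r["amount"] >= 0, max_fail_ratio=0.5)
print(passed)  # True
```

Delta Live Tables expresses the same idea declaratively with `@dlt.expect` decorators; in a pipeline the boolean result would decide whether the promotion step runs.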
Additional Information:
• Rate offered: £500-550/day (Outside IR35)
• Location: Hybrid, with travel to the client site 1-2 days/week in London
• Start date: Mid to end of September
• Duration: 3-month initial engagement with significant opportunity for extension (the current roadmap shows 9 months' scope of work)
• Required: Active SC Clearance, or willingness to undergo SC clearance