

Data Engineer - Azure / DataBricks (Outside IR35)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer - Azure / Databricks, offering a 3-month contract at £500-550 per day. Requires expertise in Azure Databricks, Python, SQL, and data pipelines. Hybrid work with 1-2 days onsite in London; SC Clearance needed.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
550
🗓️ - Date discovered
September 13, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Outside IR35
🔒 - Security clearance
Yes
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Azure #Azure Synapse Analytics #SQL (Structured Query Language) #Databricks #Azure DevOps #Data Engineering #Azure Data Factory #Delta Lake #Synapse #Deployment #DevOps #Python #Azure Databricks #Data Lake #Data Processing #ADF (Azure Data Factory) #ML (Machine Learning) #AI (Artificial Intelligence) #Data Pipeline #Data Science
Role description
This is an exciting contract opportunity for an Azure Data Engineer with a strong focus on Databricks to join an experienced team on a new customer engagement at the forefront of data analytics and AI. The role offers the chance to play a key part in the design and delivery of advanced Databricks solutions within the Azure ecosystem.
Responsibilities:
• Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables (see the illustrative sketch after this list).
• Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions.
• Drive best practices for data engineering.
• Help clients realise the potential of data science, machine learning, and scaled data processing within the Azure / Databricks ecosystem.
• Mentor junior engineers and support their personal development.
• Take ownership of the delivery of core solution components.
• Support planning, requirements refinement, and work estimation.
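For context on the pipeline work above, the sketch below shows a minimal bronze/silver/gold Delta Live Tables flow of the kind this role would design and optimise. It is illustrative only: the landing path, table names, and columns are assumptions rather than details of the engagement, and the code is meant to run inside a Databricks DLT pipeline, where the dlt module and the spark session are supplied by the runtime.

```python
# Minimal Delta Live Tables sketch (illustrative; assumed paths and column names).
# Intended to run in a Databricks DLT pipeline, which provides `dlt` and `spark`.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw events landed from cloud storage (bronze).")
def events_bronze():
    # Auto Loader picks up new JSON files incrementally from an assumed landing path.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")
    )


@dlt.table(comment="Cleaned, deduplicated events (silver).")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def events_silver():
    # The expectation above drops rows with a missing business key.
    return (
        dlt.read("events_bronze")
        .withColumn("event_date", F.to_date("event_ts"))
        .dropDuplicates(["event_id"])
    )


@dlt.table(comment="Daily event counts for reporting (gold).")
def events_gold_daily():
    return (
        dlt.read("events_silver")
        .groupBy("event_date")
        .agg(F.count("*").alias("event_count"))
    )
```

The expectation decorator on the silver table is the standard DLT way to enforce data quality declaratively instead of hand-writing filter-and-log logic.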
Skills & Experience:
• Proven experience designing and implementing data solutions in Azure using Databricks as a core platform.
• Hands-on expertise in Delta Lake, Delta Live Tables and Databricks Workflows.
• Strong coding skills in Python and SQL, with experience developing modular, reusable code in Databricks (see the sketch after this list).
• Deep understanding of lakehouse architecture, with a solid grasp of data warehousing, data lakes, and real-time data processing.
• Familiarity with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric.
• Good experience with CI/CD practices and tools for data platforms using Azure DevOps.
• Good knowledge of how to leverage AI to increase deployment productivity and quality.
• Excellent communication skills.
• Desirable: Databricks Data Engineer and/or Azure Data Engineer certification.
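As a hedged illustration of the modular, reusable Databricks code mentioned above, the sketch below factors two common transformations into parameterised functions and composes them with DataFrame.transform. Function, column, and source names are assumptions for illustration only; it runs as a plain script given a local pyspark installation.

```python
# Illustrative sketch of modular, reusable PySpark transformations (assumed names).
from typing import List

from pyspark.sql import DataFrame, SparkSession, functions as F
from pyspark.sql.window import Window


def add_ingestion_metadata(df: DataFrame, source_name: str) -> DataFrame:
    """Tag rows with standard lineage columns so every pipeline does it consistently."""
    return (
        df.withColumn("source_system", F.lit(source_name))
          .withColumn("ingested_at", F.current_timestamp())
    )


def deduplicate_latest(df: DataFrame, key_cols: List[str], order_col: str) -> DataFrame:
    """Keep only the most recent record per business key."""
    w = Window.partitionBy(*key_cols).orderBy(F.col(order_col).desc())
    return (
        df.withColumn("_rn", F.row_number().over(w))
          .filter(F.col("_rn") == 1)
          .drop("_rn")
    )


if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    orders = spark.createDataFrame(
        [(1, "2024-01-02", 12.0), (1, "2024-01-01", 10.0), (2, "2024-01-01", 5.0)],
        ["order_id", "updated_at", "amount"],
    )
    # Compose the reusable steps with DataFrame.transform to keep pipelines readable.
    cleaned = (
        orders
        .transform(lambda df: deduplicate_latest(df, ["order_id"], "updated_at"))
        .transform(lambda df: add_ingestion_metadata(df, "orders_api"))
    )
    cleaned.show()
```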
Additional Information:
• Rate offered: £500-550 per day
• Location: Hybrid, with travel to the client site in London 1-2 days per week
• Start date: October
• Duration: 3-month initial engagement with significant opportunity for extension (the current roadmap covers 9 months of scoped work)
• Required: Applicants must have, or be willing to obtain, SC Clearance (minimum 5 years’ residency in the UK).