

ShareForce
Databricks Engineer - SC Cleared
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an SC Cleared Databricks Engineer with a contract length of 3 months, offering £500-550 per day. Key skills include Azure Databricks, Delta Lake, Python, and SQL. Hybrid work with travel to the North East is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
550
-
🗓️ - Date
November 29, 2025
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
Glasgow City, Scotland, United Kingdom
-
🧠 - Skills detailed
#Data Processing #Data Lake #ADF (Azure Data Factory) #AI (Artificial Intelligence) #Azure Databricks #Python #SQL (Structured Query Language) #Azure #Data Science #Data Pipeline #DevOps #Databricks #Azure DevOps #Deployment #Delta Lake #Azure Data Factory #Synapse #Azure Synapse Analytics #ML (Machine Learning) #Data Engineering
Role description
This is an exciting contract opportunity for an SC Cleared Databricks Engineer to join an experienced team on a new customer engagement at the forefront of data analytics and AI. The role offers the chance to play a key part in the design and delivery of advanced Databricks solutions within the Azure ecosystem.
Responsibilities
• Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables (a minimal sketch follows this list).
• Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions.
• Drive best practices for data engineering.
• Help clients realise the potential of data science, machine learning, and scaled data processing within the Azure/Databricks ecosystem.
• Mentor junior engineers and support their personal development.
• Take ownership of the delivery of core solution components.
• Support planning, requirements refinement, and work estimation.
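For a rough flavour of the pipeline work described above, the sketch below shows a minimal Delta Live Tables definition in Python. The landing path, table names, and quality rule are illustrative assumptions only, not details of the engagement.

```python
# Minimal Delta Live Tables sketch (Python). Path, table names, and the
# expectation are assumptions for illustration, not part of the role spec.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested from a hypothetical landing zone.")
def bronze_events():
    # Auto Loader incrementally picks up new JSON files from cloud storage.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")  # assumed path
    )

@dlt.table(comment="Cleaned events with a basic quality expectation applied.")
@dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")
def silver_events():
    # Rows failing the expectation are dropped; a timestamp records ingestion.
    return (
        dlt.read_stream("bronze_events")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

In practice this kind of definition would run as a Delta Live Tables pipeline in the client's Databricks workspace, with expectations and table comments extended to match their data quality standards.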
Skills & Experience
• Proven experience designing and implementing data solutions in Azure using Databricks as a core platform.
• Hands-on expertise in Delta Lake, Delta Live Tables and Databricks Workflows.
• Strong coding skills in Python and SQL, with experience developing modular, reusable code in Databricks (see the example after this list).
• Deep understanding of lakehouse architecture, with a solid grasp of data warehousing, data lakes, and real-time data processing.
• Familiarity with Azure Data Factory, Azure Synapse Analytics, and Microsoft Fabric.
• Good experience with CI/CD practices and tools for data platforms using Azure DevOps.
• Good knowledge of how to leverage AI to increase deployment productivity and quality.
• Excellent communication skills.
• Desirable: Databricks Data Engineer and/or Azure Data Engineer certification.
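As an example of the "modular, reusable code" point above, a small Delta Lake upsert helper is sketched below. The table and key column names are hypothetical placeholders.

```python
# Illustrative sketch of a reusable Delta Lake upsert helper.
# Table and column names are assumptions, not part of the role spec.
from delta.tables import DeltaTable
from pyspark.sql import DataFrame, SparkSession

def upsert_to_delta(spark: SparkSession, updates: DataFrame,
                    target_table: str, key_col: str = "id") -> None:
    """Merge a batch of updates into an existing Delta table by key."""
    target = DeltaTable.forName(spark, target_table)
    (
        target.alias("t")
        .merge(updates.alias("s"), f"t.{key_col} = s.{key_col}")
        .whenMatchedUpdateAll()      # update existing rows in place
        .whenNotMatchedInsertAll()   # insert new rows
        .execute()
    )
```

Packaging helpers like this in shared notebooks or wheel libraries is one common way such teams keep Databricks code modular and testable.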
Additional Information
• Rate offered: £500-550 per day
• IR35 Status: Outside
• Location: Hybrid with travel to client site in the North East
• Start date: January
• Duration: 3-month initial term with significant opportunity for extension
• Required: Active SC Clearance






