

McCabe & Barton
Databricks Consultant
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Consultant on a 6-month contract in London with hybrid work. Requires strong Databricks and Azure experience, proficiency in Python and SQL, and relevant certifications. Must have expertise in data pipelines, ETL/ELT processes, and data architecture.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 2, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Deployment #Storage #Scala #Security #Azure Data Factory #Data Engineering #Monitoring #Python #Data Pipeline #ADF (Azure Data Factory) #Delta Lake #GIT #SQL (Structured Query Language) #ADLS (Azure Data Lake Storage) #Data Governance #Data Quality #Azure DevOps #Azure #Databricks #DevOps #Datasets #Cloud #Data Lake #Data Architecture #Terraform #Azure ADLS (Azure Data Lake Storage) #ETL (Extract, Transform, Load)
Role description
Data Engineer (Azure & Databricks) - 6-month Contract
We are seeking an experienced Senior Data Engineer with strong Databricks expertise to design, build, and maintain a scalable cloud-based data platform for a private equity firm. This is a 6-month contract based in London with hybrid working.
Key Responsibilities
• Design and build robust data pipelines using Azure Data Factory and Databricks
• Develop ETL/ELT processes to transform raw data into analytics-ready datasets
• Architect and maintain data lake solutions using Azure Data Lake Storage
• Implement Databricks solutions using Delta Lake and medallion architecture (bronze, silver, gold)
• Optimise performance, data quality, and cost efficiency across pipelines and clusters
• Set up monitoring and alerting, and ensure high availability of data services
• Collaborate with data teams and stakeholders to deliver business-focused solutions
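The medallion architecture named in the responsibilities above refines data through bronze (raw), silver (cleaned), and gold (analytics-ready) layers. A minimal, dependency-free Python sketch of that layering follows; a real Databricks implementation would use PySpark and Delta Lake tables, and all names here are hypothetical:

```python
# Conceptual sketch of the medallion (bronze/silver/gold) pattern.
# On Databricks this would be PySpark jobs writing Delta tables;
# this plain-Python version only illustrates the layered refinement.

def bronze(raw_records):
    """Bronze: land raw data as-is, tagging each record with its source."""
    return [{"source": "ingest", **r} for r in raw_records]

def silver(bronze_records):
    """Silver: clean and conform - drop malformed rows, normalise types."""
    cleaned = []
    for r in bronze_records:
        if r.get("amount") is None:  # basic data-quality filter
            continue
        cleaned.append({**r, "amount": float(r["amount"])})
    return cleaned

def gold(silver_records):
    """Gold: aggregate into an analytics-ready dataset."""
    totals = {}
    for r in silver_records:
        totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]
    return totals

raw = [
    {"account": "a1", "amount": "10.5"},
    {"account": "a1", "amount": None},   # dropped at the silver layer
    {"account": "a2", "amount": "3"},
]
print(gold(silver(bronze(raw))))  # → {'a1': 10.5, 'a2': 3.0}
```

Each layer reads only from the one before it, which is what makes the pattern easy to monitor and reprocess stage by stage.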
Essential Skills & Experience
• Strong hands-on experience with Databricks (essential)
• Proven experience with Azure services
• Proficiency in Python and SQL
• Experience with CI/CD, Git, and deployment pipelines
• Solid understanding of data modelling and data architecture
• Knowledge of data governance and security best practices
• Experience with Terraform or other Infrastructure-as-Code tooling
• Experience with Azure DevOps
• Relevant Azure or Databricks certifications






