McCabe & Barton

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Senior Data Engineer contract position in London, offering £600-£800 per day inside IR35 for 6 months. It requires 5+ years of Azure experience, strong Databricks skills, and proficiency in SQL and Python. Azure Data Engineer or Databricks certifications are preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
800
🗓️ - Date
January 16, 2026
🕒 - Duration
6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Synapse #Data Governance #Data Quality #Azure SQL #Databricks #Data Pipeline #Azure Data Factory #ADLS (Azure Data Lake Storage) #Security #Documentation #Azure DevOps #Data Engineering #ETL (Extract, Transform, Load) #Deployment #Infrastructure as Code (IaC) #Python #Scala #Terraform #PySpark #SQL (Structured Query Language) #GIT #Delta Lake #Spark (Apache Spark) #Azure #GitHub #Azure ADLS (Azure Data Lake Storage) #DevOps #Azure SQL Database #Monitoring #Disaster Recovery #Storage #Data Lake #Data Science #Cloud #ADF (Azure Data Factory)
Role description
Senior Data Engineer – Contract – £600-£800 per day inside IR35
Hybrid (3 days in the office, 2 days WFH)
London

McCabe & Barton are partnering with a leading financial services client to recruit an experienced Data Platform Engineer. This is an excellent opportunity to join a forward-thinking team driving innovation with modern cloud-based data technologies.

Role Overview
As a Data Platform Engineer, you will design, build, and maintain scalable cloud-based data infrastructure using Azure and Databricks. You'll play a key role in ensuring that data pipelines, architecture, and analytics environments are reliable, performant, and secure.

Key Responsibilities

Platform Development & Maintenance
• Design and implement data pipelines using Azure Data Factory, Databricks, and related Azure services.
• Build ETL/ELT processes to transform raw data into structured, analytics-ready formats.
• Optimise pipeline performance and ensure high availability of data services.

Infrastructure & Architecture
• Architect and deploy scalable data lake solutions using Azure Data Lake Storage.
• Implement governance and security measures across the platform.
• Leverage Terraform or similar IaC tools for controlled and reproducible deployments.

Databricks Development
• Develop and optimise data jobs using PySpark or Scala within Databricks.
• Implement the medallion architecture (bronze, silver, gold layers) and use Delta Lake for reliable data transactions (see the first sketch at the end of this posting).
• Manage cluster configurations and CI/CD pipelines for Databricks deployments.

Monitoring & Operations
• Implement monitoring solutions using Azure Monitor, Log Analytics, and Databricks tools.
• Optimise performance, ensure SLAs are met, and establish disaster recovery and backup strategies.

Collaboration & Documentation
• Partner with data scientists, analysts, and business stakeholders to deliver effective solutions.
• Document technical designs, data flows, and operational procedures for knowledge sharing.

Essential Skills & Experience
• 5+ years of experience with Azure services (Azure Data Factory, ADLS, Azure SQL Database, Synapse Analytics).
• Strong hands-on expertise in Databricks, Delta Lake, and cluster management.
• Proficiency in SQL and Python for pipeline development.
• Familiarity with Git/GitHub and CI/CD practices.
• Understanding of data modelling, data governance, and security principles.

Desirable Skills
• Experience with Terraform or other Infrastructure-as-Code tools.
• Familiarity with Azure DevOps or similar CI/CD platforms.
• Experience with data quality frameworks and testing (see the second sketch at the end of this posting).
• Azure Data Engineer or Databricks certifications.

Please apply with an updated CV if you align with the key skills required!
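For candidates less familiar with the medallion architecture referenced above, here is a minimal PySpark/Delta Lake sketch of bronze, silver, and gold layers. The trades dataset, storage paths, and column names (trade_id, trade_date, desk, notional) are illustrative assumptions, not details of the client's platform; on Databricks the spark session and Delta Lake support are available out of the box.

```python
# Minimal medallion-architecture sketch with PySpark and Delta Lake.
# All paths, columns, and the trades dataset are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Bronze: land raw source files as-is, tagged with ingestion metadata.
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/trades/")
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").save("/mnt/lake/bronze/trades"))

# Silver: clean and conform; drop bad rows, deduplicate, enforce types.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/trades")
silver = (bronze.filter(F.col("trade_id").isNotNull())
                .dropDuplicates(["trade_id"])
                .withColumn("trade_date", F.to_date("trade_date")))
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/trades")

# Gold: business-level aggregates ready for analytics and reporting.
gold = (silver.groupBy("trade_date", "desk")
              .agg(F.sum("notional").alias("total_notional"),
                   F.count("*").alias("trade_count")))
gold.write.format("delta").mode("overwrite").save("/mnt/lake/gold/trade_summary")
```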
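On the data quality frameworks and testing mentioned under Desirable Skills, a simple quality gate might look like the sketch below. The checks, columns, and thresholds are hypothetical; production teams often use frameworks such as Great Expectations or Delta Live Tables expectations rather than hand-rolled checks like this.

```python
# Minimal data-quality gate sketch; columns and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.format("delta").load("/mnt/lake/silver/trades")

total = df.count()
null_ids = df.filter(F.col("trade_id").isNull()).count()
duplicates = total - df.dropDuplicates(["trade_id"]).count()

# Fail fast so bad data never reaches the gold layer or downstream consumers.
if null_ids > 0 or duplicates > 0:
    raise ValueError(f"Quality gate failed: {null_ids} null trade IDs and "
                     f"{duplicates} duplicate rows out of {total} total")
print(f"Quality gate passed: {total} rows validated")
```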