

McCabe & Barton
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a contract basis, hybrid in London, paying £800 per day. Requires 5+ years of experience with Azure services, Databricks, SQL, and Python. Terraform experience and Azure Data Engineer certifications are desirable.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
800
🗓️ - Date
February 15, 2026
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#PySpark #Programming #Azure Data Factory #Synapse #Security #Monitoring #Data Engineering #Azure Databricks #Data Pipeline #Data Lake #Databricks #Documentation #Data Science #Deployment #Scala #Disaster Recovery #Storage #Data Quality #Azure DevOps #Spark (Apache Spark) #Snowflake #Data Governance #ADF (Azure Data Factory) #Azure #Infrastructure as Code (IaC) #DevOps #Python #Cloud #ETL (Extract, Transform, Load) #ADLS (Azure Data Lake Storage) #Azure SQL Database #Azure SQL #Delta Lake #Terraform #GitHub #Git #SQL (Structured Query Language)
Role description
Senior Data Engineer – Contract Role
Hybrid (3 days in the office, 2 days WFH)
London
McCabe & Barton are partnering with a leading financial services client to recruit an experienced Data Platform Engineer. This is an excellent opportunity to join a forward-thinking team driving innovation with modern cloud-based data technologies.
Role Overview
As a Data Platform Engineer, you will design, build, and maintain scalable cloud-based data infrastructure using Azure and Databricks. You’ll play a key role in ensuring that data pipelines, architecture, and analytics environments are reliable, performant, and secure.
Key Responsibilities
Platform Development & Maintenance
• Design and implement data pipelines using Azure Data Factory, Databricks, and related Azure services.
• Build ETL/ELT processes to transform raw data into structured, analytics-ready formats (a minimal sketch of this pattern follows this list).
• Optimise pipeline performance and ensure high availability of data services.
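To make the ETL/ELT work concrete, here is a minimal PySpark sketch of the pattern, assuming a Databricks workspace with access to ADLS. The storage paths, the events dataset, and all column names (event_id, event_ts, amount) are hypothetical illustrations, not the client's actual design.

# Minimal PySpark ETL sketch (hypothetical paths, dataset, and columns):
# read raw JSON landed in ADLS, normalise types, and write an
# analytics-ready Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically on Databricks

RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/events/"          # hypothetical
CURATED_PATH = "abfss://curated@examplelake.dfs.core.windows.net/events/"  # hypothetical

raw = spark.read.json(RAW_PATH)

curated = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))            # string -> timestamp
    .withColumn("event_date", F.to_date("event_ts"))               # partition key
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))   # enforce numeric type
    .filter(F.col("event_id").isNotNull())                         # drop unkeyed rows
    .dropDuplicates(["event_id"])                                  # idempotent re-runs
)

(curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save(CURATED_PATH))

Filtering and deduplicating on the key column early makes the job safe to re-run, which matters once the pipeline is on a schedule.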
Infrastructure & Architecture
• Architect and deploy scalable data lake solutions using Azure Data Lake Storage.
• Implement governance and security measures across the platform.
• Leverage Terraform or similar IaC tools for controlled and reproducible deployments.
Databricks Development
• Develop and optimise data jobs using PySpark or Scala within Databricks.
• Implement the medallion architecture (bronze, silver, gold layers) and use Delta Lake for reliable data transactions; a sketch of the pattern follows this list.
• Manage cluster configurations and CI/CD pipelines for Databricks deployments.
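Since the medallion architecture is called out explicitly, a brief sketch may help. The following assumes a Databricks workspace where bronze, silver, and gold schemas already exist; the trades tables, storage path, and business rules are illustrative assumptions.

# Hypothetical medallion sketch: promote data through bronze (raw),
# silver (cleaned), and gold (aggregated) Delta tables.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw records as-is, preserving an auditable copy.
bronze = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/trades/")
bronze.write.format("delta").mode("append").saveAsTable("bronze.trades")

# Silver: enforce types, deduplicate, and reject obviously bad rows.
# Delta's transactional writes keep the overwrite safe for concurrent readers.
silver = (
    spark.table("bronze.trades")
    .withColumn("trade_ts", F.to_timestamp("trade_ts"))
    .dropDuplicates(["trade_id"])
    .filter(F.col("notional") > 0)
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.trades")

# Gold: analytics-ready aggregate for downstream consumers.
gold = (
    spark.table("silver.trades")
    .groupBy(F.to_date("trade_ts").alias("trade_date"), "desk")
    .agg(F.sum("notional").alias("total_notional"),
         F.count(F.lit(1)).alias("trade_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_desk_summary")

Keeping each layer as its own Delta table makes raw data replayable: if the silver logic changes, bronze can be reprocessed without re-ingesting from source.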
Monitoring & Operations
• Implement monitoring solutions using Azure Monitor, Log Analytics, and Databricks tools.
• Optimise performance, ensure SLAs are met, and establish disaster recovery and backup strategies; a simple SLA freshness check is sketched below.
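On the operations side, a freshness check is one common way to enforce a pipeline SLA. The sketch below could run as a scheduled Databricks job; the silver.trades table and the four-hour window are assumptions for illustration, not the client's actual SLA.

# Hypothetical freshness check, runnable as a scheduled Databricks job:
# fail loudly if the curated table has not been updated within the SLA.
from datetime import datetime, timedelta
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

SLA_WINDOW = timedelta(hours=4)  # assumed SLA window, illustration only

latest = (
    spark.table("silver.trades")               # hypothetical table
    .agg(F.max("trade_ts").alias("latest_ts"))
    .collect()[0]["latest_ts"]
)

# Spark returns naive datetimes; this assumes trade_ts is stored in UTC.
if latest is None or datetime.utcnow() - latest > SLA_WINDOW:
    # In production this would feed an Azure Monitor alert; failing the
    # job is the simplest way to surface the breach to the scheduler.
    raise RuntimeError(f"Freshness SLA breached: latest record at {latest}")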
Collaboration & Documentation
• Partner with data scientists, analysts, and business stakeholders to deliver effective solutions.
• Document technical designs, data flows, and operational procedures for knowledge sharing.
Essential Skills & Experience
• 5+ years of experience with Azure services (Azure Data Factory, ADLS, Azure SQL Database, Synapse Analytics).
• Strong hands-on expertise in Databricks, Delta Lake, and cluster management.
• Proficiency in SQL and Python for pipeline development.
• Familiarity with Git/GitHub and CI/CD practices.
• Understanding of data modelling, data governance, and security principles.
Desirable Skills
• Experience with Terraform or other Infrastructure-as-Code tools.
• Familiarity with Azure DevOps or similar CI/CD platforms.
• Experience with data quality frameworks and testing.
• Azure Data Engineer or Databricks certifications.
Please apply with an updated CV if your experience aligns with the key skills required!
• Seniority Level: Mid-Senior level
• Industry: Financial Services, Staffing and Recruiting, Engineering Services
• Employment Type: Contract
• Job Functions: Finance, Engineering, Information Technology
• Skills: Platform Development; Azure Databricks; Azure Data Factory; Continuous Integration and Continuous Delivery (CI/CD); Extract, Transform, Load (ETL); Snowflake; Terraform; Python (Programming Language); Service-Level