Nicoll Curtin

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Contract) focusing on Azure and Databricks, offering £600/day for a hybrid position. Key skills required include Azure, Databricks, Python, and Spark SQL, with experience in data governance and real-time processing essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
600
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
Leicester, England, United Kingdom
🧠 - Skills detailed
#ADF (Azure Data Factory) #ETL (Extract, Transform, Load) #Cloud #Data Lake #Spark SQL #SQL (Structured Query Language) #DevOps #Data Governance #Terraform #Azure DevOps #Scala #Spark (Apache Spark) #Data Architecture #Infrastructure as Code (IaC) #Azure #Synapse #Data Engineering #Data Quality #ACID (Atomicity, Consistency, Isolation, Durability) #Python #Apache Spark #Azure Data Factory #Delta Lake #Databricks
Role description
Senior Data Engineer (Contract) | Azure & Databricks

We’re looking for a Senior Data Engineer to join a growing data team (currently three members plus a lead) and play a key role in strengthening the group’s engineering capability. The focus will be on migrating and integrating data platforms, cleaning and documenting legacy data, and contributing to the design and build of the new data architecture. This is an opportunity to get involved early in the platform’s development, close to greenfield, and to help establish engineering standards and best practices that will support long-term scalability. The environment is primarily Databricks-based, with legacy data quality posing some interesting challenges that the team is keen to resolve properly this time around.

What you’ll be doing:
• Building and optimising ETL/ELT pipelines with Databricks & Apache Spark
• Designing scalable solutions using Azure Data Factory, Synapse & Data Lake
• Implementing Delta Lake with ACID transactions and schema enforcement
• Automating testing and CI/CD with Azure DevOps & Terraform
• Driving data governance with Azure Purview & Unity Catalog
• Developing real-time data solutions with Event Hubs & Structured Streaming
• Managing infrastructure as code and optimising cloud costs
• Working closely with engineers, QA, the Product Owner, and stakeholders

What we’re looking for:
• Strong hands-on experience with Azure & Databricks
• Proficiency in Python, Spark SQL, and Terraform
• Solid understanding of data governance and real-time processing
• Great communication skills and a collaborative mindset

£600/day | Outside IR35 | Hybrid | Contract | Modern tech stack | Friendly team

If this sounds like something you’d be interested in, feel free to reach out or share it with someone who might be interested.