La Fosse

Senior Data Engineer - Contract Role

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 7-month contract, paying £500-525/day, based in Milton Keynes or Manchester. Requires 7+ years of experience, expertise in Databricks and Azure, and strong skills in data pipeline design and optimization.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
525
🗓️ - Date
November 22, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
Manchester Area, United Kingdom
🧠 - Skills detailed
#PySpark #Azure #Databricks #Data Pipeline #Scala #Python #SQL (Structured Query Language) #Cloud #Spark (Apache Spark) #dbt (data build tool) #Data Processing #ETL (Extract, Transform, Load) #Data Quality #Data Engineering
Role description
Senior Data Engineer Contract Role
Outside IR35 | £500-525/day | Milton Keynes or Manchester | Hybrid | Initial 7 Months

We have a new Senior Data Engineer contract opportunity with a leading global client with offices in both Manchester and Milton Keynes. You’ll join a high-impact team focused on rebuilding and optimising data pipelines into a Databricks Lakehouse environment, enabling clean, scalable, and high-quality data delivery for analytics and reporting. This is an outside-IR35, greenfield engagement requiring one day per week on-site, offered as an initial 7-month contract with an immediate start and strong potential for extension.

Key Responsibilities
• Design, build, and optimise data pipelines using ELT/ETL best practices.
• Migrate and transform data into a Databricks Lakehouse architecture.
• Ensure data quality, reliability, and scalability for analytics and reporting.
• Collaborate with stakeholders to deliver robust solutions aligned with business needs.
• Support production environments and troubleshoot performance issues.

Ideal Candidate
• 7+ years in Data Engineering with strong experience in cloud-based data platforms.
• Proven ability to design and optimise pipelines for large-scale data processing.
• Hands-on experience with Databricks and Azure.
• Strong stakeholder communication and problem-solving skills.

Tech Stack
Required: Databricks, dbt, Python, PySpark, SQL, Azure.
Bonus: Experience in eCommerce environments.

If you are interested, please apply below!