Oliver Bernard

Senior Data Engineer - Contract

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Contract) in London, hybrid (3 days on-site), for 12 months at £800 per day. Requires advanced Python, Apache Spark, and AWS expertise, plus experience with data pipelines, ETL/ELT patterns, and large datasets.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
800
🗓️ - Date
February 25, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Datasets #Agile #Terraform #Batch #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Apache Airflow #Data Quality #Data Science #Cloud #Data Pipeline #Scala #Data Architecture #Lambda (AWS Lambda) #ETL (Extract, Transform, Load) #Infrastructure as Code (IaC) #Redshift #Spark (Apache Spark) #Airflow #Data Engineering #Python #Apache Spark
Role description
Senior Data Engineer (Contract)
Location: London (Hybrid – 3 days per week on-site)
Contract Length: 12 months
Rate: £800 per day
IR35 Status: Inside IR35
Start Date: ASAP / Flexible

Overview
We are seeking an experienced Senior Data Engineer to join a high-performing data platform team working on large-scale, cloud-based data solutions. This role will play a key part in designing, building, and maintaining robust data pipelines and platforms that support analytics, reporting, and downstream data products.

Key Responsibilities
• Design, build, and maintain scalable, reliable data pipelines using Python and Apache Spark
• Develop and orchestrate workflows using Apache Airflow
• Build and optimise data solutions on AWS (e.g. S3, Glue, EMR, Redshift, Lambda)
• Collaborate with data scientists, analysts, and product teams to deliver high-quality data products
• Ensure data quality, reliability, and performance across pipelines
• Contribute to architectural decisions and best practices for data engineering
• Troubleshoot, monitor, and optimise existing data workflows
• Work in an agile environment, contributing to sprint planning and technical discussions

Required Skills & Experience
• Strong commercial experience as a Senior Data Engineer
• Advanced Python development for data engineering
• Hands-on experience with Apache Spark (batch and/or streaming)
• Proven experience with Apache Airflow for workflow orchestration
• Strong experience working within AWS cloud environments
• Solid understanding of data modelling, ETL/ELT patterns, and data warehousing concepts
• Experience working with large-scale, complex datasets
• Strong communication skills and ability to work in a collaborative team environment

Desirable Experience
• Experience with CI/CD for data pipelines
• Knowledge of infrastructure as code (e.g. Terraform, CloudFormation)
• Exposure to real-time or streaming data architectures
• Experience in regulated or enterprise-scale environments

Additional Information
• Hybrid working: 3 days per week on-site in London, 2 days remote
• Inside IR35 contract
• Competitive daily rate depending on experience