Oliver Bernard

Senior Data Engineer - Contract

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 12-month contract in London (Hybrid). Pay ranges from £700 to £800 per day. Key skills include Python, Apache Spark, Apache Airflow, and AWS experience. Strong data engineering background required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
800
🗓️ - Date
January 17, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Python #Cloud #Redshift #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Spark (Apache Spark) #Data Architecture #Data Science #Infrastructure as Code (IaC) #Airflow #Datasets #Data Quality #Scala #Terraform #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #Data Pipeline #Batch #Agile #Data Engineering
Role description
Senior Data Engineer (Contract)

Location: London (Hybrid – 3 days per week on-site)
Contract Length: 12 months
Rate: £700 – £800 per day
IR35 Status: Inside IR35
Start Date: ASAP / Flexible

Overview

We are seeking an experienced Senior Data Engineer to join a high-performing data platform team working on large-scale, cloud-based data solutions. This role will play a key part in designing, building, and maintaining robust data pipelines and platforms that support analytics, reporting, and downstream data products.

Key Responsibilities

• Design, build, and maintain scalable, reliable data pipelines using Python and Apache Spark
• Develop and orchestrate workflows using Apache Airflow
• Build and optimise data solutions on AWS (e.g. S3, Glue, EMR, Redshift, Lambda)
• Collaborate with data scientists, analysts, and product teams to deliver high-quality data products
• Ensure data quality, reliability, and performance across pipelines
• Contribute to architectural decisions and best practices for data engineering
• Troubleshoot, monitor, and optimise existing data workflows
• Work in an agile environment, contributing to sprint planning and technical discussions

Required Skills & Experience

• Strong commercial experience as a Senior Data Engineer
• Advanced Python development for data engineering
• Hands-on experience with Apache Spark (batch and/or streaming)
• Proven experience with Apache Airflow for workflow orchestration
• Strong experience working within AWS cloud environments
• Solid understanding of data modelling, ETL/ELT patterns, and data warehousing concepts
• Experience working with large-scale, complex datasets
• Strong communication skills and the ability to work in a collaborative team environment

Desirable Experience

• Experience with CI/CD for data pipelines
• Knowledge of infrastructure as code (e.g. Terraform, CloudFormation)
• Exposure to real-time or streaming data architectures
• Experience in regulated or enterprise-scale environments

Additional Information

• Hybrid working: 3 days per week on-site in London, 2 days remote
• Inside IR35 contract
• Competitive daily rate depending on experience