Synergize Consulting

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with active security clearance, based in Reading (Hybrid). Contract length is unspecified, paying up to £84 p/h Inside IR35. Key skills include Python, SQL, cloud platforms, and experience with large-scale datasets, preferably in aerospace.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
672
-
🗓️ - Date
April 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
Reading, England, United Kingdom
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Data Pipeline #Automation #Big Data #Infrastructure as Code (IaC) #Terraform #Data Science #NoSQL #SQL (Structured Query Language) #Azure #Deployment #Database Architecture #NiFi (Apache NiFi) #Scala #Spark (Apache Spark) #Data Engineering #GCP (Google Cloud Platform) #ML (Machine Learning) #Security #Datasets #Python #AWS (Amazon Web Services) #Cloud #DevOps #Ansible #Hadoop #Storage #Docker
Role description
Senior Data Engineer
Active security clearance is required
Reading (Hybrid – on-site as required)
Up to £84 p/h Inside IR35

A leading Defence prime based in Reading is looking for a security-cleared Senior Data Engineer to play a key role in shaping data capability within a greenfield, mission-critical Defence & Aerospace environment. This is a hands-on opportunity to design and deliver secure, scalable data pipelines and platforms supporting complex engineering and operational datasets, including high-volume sensor and aerospace data. If you enjoy building from scratch, solving complex data challenges, and working in secure, high-impact environments, this role is built for you.

Key skills
• Live security clearance (SC) – this is essential
• Proven experience as a Data Engineer in complex environments
• Strong Python and SQL skills, with experience building and optimising pipelines
• Solid understanding of SQL and NoSQL database architectures
• Experience working with large-scale or high-volume datasets (ideally aerospace or sensor data)
• Experience with ETL pipelines, orchestration tools and APIs
• Hands-on experience with cloud platforms (AWS, Azure, or GCP)
• Familiarity with Docker and modern engineering tooling
• Exposure to big data ecosystems (Spark, Hadoop, NiFi)
• Familiarity with Infrastructure as Code (Terraform, Ansible)
• Understanding of machine learning models and deployment

Key responsibilities
• Designing and delivering end-to-end data solutions (ingestion, integration, storage, processing, analysis)
• Building robust data pipelines and ETL workflows for complex, high-volume datasets
• Developing tools and automation to enable data-driven insights
• Working with cloud platforms (AWS, Azure, GCP) in secure environments
• Supporting machine learning workflows and data science initiatives
• Collaborating with DevOps, Data Scientists, Analysts, and Cyber teams
• Leveraging big data and orchestration technologies (Spark and NiFi)
• Leading small work packages and mentoring junior engineers
• Contributing to technical design decisions and engineering best practice