Experis UK

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with active SC clearance, lasting 1 year, fully remote. Key skills include Java, Node JS, Python, PySpark, and cloud experience (AWS/Azure). Strong experience in data quality management and testing is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 1, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
City Of London, England, United Kingdom
-
🧠 - Skills detailed
#Data Engineering #Cloud #ETL (Extract, Transform, Load) #Data Quality #Databricks #React #AWS (Amazon Web Services) #Data Pipeline #PySpark #Java #Spark (Apache Spark) #Azure #Python #AWS Glue #JSON (JavaScript Object Notation) #Pytest #RDS (Amazon Relational Database Service)
Role description
Onsite Requirements: Remote
Start Date: ASAP
Role Duration: 1 year
Clearance Requirements: Active SC clearance
Inside IR35 - umbrella only

Role Description
We're looking for a Data Engineer whose main focus is understanding and documenting existing systems, with the goal of supporting decommissioning activities. The role centres on analysing current solutions built using Java, Node JS, and React, and developing a clear, end-to-end picture of how data flows across the wider programme. This includes documenting data flows, system dependencies, and underlying data models, ensuring there is a clear record of how data is structured, stored, and used throughout the solution. The role involves investigating how systems are used day to day, clarifying ownership and integration points, and capturing this information in a way that supports risk assessment and decommissioning decisions.

Responsibilities
Python and PySpark are required as supporting capabilities, used where needed to analyse data pipelines and confirm how data moves and transforms in practice. The role also requires strong experience with testing and data quality management, ensuring that documented data flows and models are accurate and trusted. Experience working in cloud environments such as AWS or Azure is expected, with Databricks considered a nice-to-have.

Required Skills
• Java background
• Node JS
• JSON
• RDS
• React
• Data Modelling
• Python / Spark
• Cloud experience (AWS / Azure)
  o AWS Glue
  o Databricks
• Testing e.g. PyTest
• Data Quality e.g. Great Expectations