

Experis UK
Data Analyst
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analyst with active SC clearance; it runs for 1 year and is fully remote. Key skills include Java, Node JS, Python, PySpark, and cloud experience (AWS/Azure). Strong data quality management and testing experience are essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 21, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
City Of London, England, United Kingdom
-
🧠 - Skills detailed
#Java #AWS Glue #Spark (Apache Spark) #Data Pipeline #ETL (Extract, Transform, Load) #Databricks #Data Quality #PySpark #Data Analysis #Data Engineering #RDS (Amazon Relational Database Service) #Cloud #AWS (Amazon Web Services) #Python #JSON (JavaScript Object Notation) #React #Azure #Pytest
Role description
Onsite Requirements: Remote
Start Date: ASAP
Role Duration: 1 year
Clearance Requirements: Active SC clearance
Inside IR35 - umbrella only
Role Description
We're looking for a Data Engineer whose main focus is understanding and documenting existing systems, with the goal of supporting decommissioning activities. The role centres on analysing current solutions built using Java, Node JS, and React, and developing a clear, end-to-end picture of how data flows across the wider programme.
This includes documenting data flows, system dependencies, and underlying data models, ensuring there is a clear record of how data is structured, stored, and used throughout the solution. The role involves investigating how systems are used on a day-to-day basis, clarifying ownership and integration points, and capturing this information in a way that supports risk assessment and decommissioning decisions.
Responsibilities
Python and PySpark are required as supporting capabilities, used where needed to analyse data pipelines and confirm how data moves and transforms in practice. The role also requires strong experience with testing and data quality management, ensuring that documented data flows and models are accurate and trusted. Experience working in cloud environments such as AWS or Azure is expected, with Databricks considered a nice to have.
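In practice, "confirming how data moves and transforms" often means running a pipeline step against sample records and asserting the observed behaviour. A minimal sketch in plain Python (PySpark omitted for brevity; the transform and record names are hypothetical, not from this role):

```python
# Hypothetical pipeline step: confirm and document how records are
# transformed. All names (normalise_record, raw_records) are illustrative.

def normalise_record(raw: dict) -> dict:
    """Example transform: trim whitespace and standardise the date key."""
    return {
        "id": raw["id"],
        "name": raw["name"].strip(),
        "created": raw.get("created_at") or raw.get("created"),
    }

raw_records = [
    {"id": 1, "name": "  Alice ", "created_at": "2026-01-02"},
    {"id": 2, "name": "Bob", "created": "2026-01-03"},
]

transformed = [normalise_record(r) for r in raw_records]

# Capture the observed behaviour of the flow as explicit checks:
assert transformed[0]["name"] == "Alice"          # whitespace trimmed
assert all("created" in r for r in transformed)   # date key unified
```

Checks like these double as documentation: they record, in executable form, exactly how data is structured before and after each step.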
Required Skills
• Java background
• Node JS
• JSON
• RDS
• React
• Data Modelling
• Python / Spark
• Cloud experience (AWS / Azure)
  o AWS Glue
  o Databricks
• Testing e.g. PyTest
• Data Quality e.g. Great Expectations
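The PyTest and data quality items above might look like the following minimal sketch, using plain Python assertions rather than a specific library (dataset and column names are hypothetical):

```python
# Hedged sketch of PyTest-style data quality checks. The helpers and
# the customer dataset below are illustrative, not from the role.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return True if values in `column` are unique across rows."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def test_customer_data_quality():
    rows = [
        {"customer_id": 1, "email": "a@example.com"},
        {"customer_id": 2, "email": "b@example.com"},
    ]
    assert check_not_null(rows, "email") == []  # no missing emails
    assert check_unique(rows, "customer_id")    # IDs are unique
```

Run with `pytest`, checks like these give the documented data flows and models the "accurate and trusted" backing the description asks for; a library such as Great Expectations packages the same idea with richer reporting.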






