Haystack

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with active SC Clearance, offering £500 - £510/day for a 1-year contract. Key skills include Java, Node.js, Python, PySpark, and cloud experience (AWS/Azure), focusing on data integrity and complex system documentation.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
510
🗓️ - Date
February 28, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Inside IR35
🔒 - Security
Yes
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Data Integrity #Azure #Java #JSON (JavaScript Object Notation) #RDS (Amazon Relational Database Service) #React #Data Engineering #Spark (Apache Spark) #ETL (Extract, Transform, Load) #PySpark #Data Quality #Documentation #AWS Glue #Cloud #AWS (Amazon Web Services) #Pytest #Azure cloud #Data Pipeline #Python #Databricks
Role description
Data Engineer | Remote | £500 - £510/day

We're working with a leading global workforce solutions powerhouse that specializes in high-impact digital transformation and large-scale technical consultancy on this exciting opportunity. This is a key role for a Data Engineer who excels at architectural detective work, focusing on documenting and deconstructing complex systems using a diverse stack including Java, Node.js, and Python. You will play a pivotal role in a major decommissioning programme, mapping end-to-end data flows and ensuring data integrity across AWS/Azure cloud environments.

The Role
• Lead the technical discovery and deep-dive analysis of existing legacy solutions built on Java, Node.js, and React to support strategic decommissioning.
• Map complex end-to-end data flows and system dependencies across a wide-scale programme, ensuring every integration point is accounted for.
• Utilize Python and PySpark to interrogate data pipelines, confirming how data transforms and moves through RDS and JSON structures.
• Develop comprehensive data models and system documentation that serve as the "source of truth" for risk assessment and final decommissioning decisions.
• Champion data quality and accuracy, leveraging PyTest and Great Expectations to ensure all documented flows are verified and trusted by stakeholders.

What You'll Need
• Active SC Clearance is mandatory for this role due to the nature of the programme data.
• Strong technical background with hands-on experience in Java, Node.js, and React, specifically for reverse-engineering and system analysis.
• Expert-level knowledge of data modelling, Python, and PySpark for analyzing large-scale cloud data pipelines.
• Proven experience in cloud environments (AWS or Azure), specifically with tools like AWS Glue, RDS, and ideally Databricks.
• A rigorous approach to testing and data quality management using frameworks like PyTest and Great Expectations.
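To give a flavour of the data-quality work described above, here is a minimal sketch of a PyTest-style integrity check. It uses plain Python in place of the PySpark/Great Expectations stack the role actually calls for, and every record shape and field name (`id`) is a hypothetical example, not anything from the programme itself.

```python
def validate_records(records):
    """Return a list of data-quality issues found in a batch of records.

    Checks two basic integrity rules of the kind a decommissioning
    sign-off might require: no null identifiers, no duplicates.
    """
    issues = []
    seen_ids = set()
    for record in records:
        # Every record must carry a non-null identifier.
        if record.get("id") is None:
            issues.append("missing id")
            continue
        # Identifiers must be unique across the batch.
        if record["id"] in seen_ids:
            issues.append(f"duplicate id: {record['id']}")
        seen_ids.add(record["id"])
    return issues


def test_validator_flags_duplicates_and_nulls():
    batch = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
    assert validate_records(batch) == ["duplicate id: 2", "missing id"]
```

In the real role the same idea would typically be expressed as Great Expectations expectation suites (or PySpark aggregations) run under PyTest, so that stakeholders get a verifiable report rather than ad-hoc checks.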
What's On Offer
• Competitive day rate of £500 - £510 (Inside IR35 - Umbrella).
• 100% remote working flexibility within the UK for a healthy work-life balance.
• Long-term stability with a 1-year initial contract duration on a massive transformation programme.
• Opportunity to work on a high-visibility project involving a modern tech stack (Databricks, Spark, AWS).

Apply via Haystack today!