

Haystack
Data Analyst
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analyst with an active SC Clearance, offering £500 - £510 per day for a 1-year remote contract. Key skills include Python, PySpark, and AWS, plus experience with Java, Node.js, and React in a cloud environment.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
510
-
🗓️ - Date
April 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Storage #Datasets #Data Lineage #ETL (Extract, Transform, Load) #React #Cloud #Security #Pytest #Spark (Apache Spark) #Data Pipeline #RDS (Amazon Relational Database Service) #Java #Databricks #AWS Glue #PySpark #AWS (Amazon Web Services) #JSON (JavaScript Object Notation) #Data Architecture #Data Quality #Python #Data Analysis #Data Integrity #Migration #Automated Testing
Role description
Data Analyst | £500 - £510
We're working with a leading digital transformation consultancy that delivers mission-critical public sector infrastructure on this exciting opportunity.
We are searching for a technically astute Data Analyst with an engineering background to spearhead a major decommissioning program. You will bridge the gap between legacy codebases and modern data architecture, using Python and PySpark to map complex data flows within a high-security AWS environment.
The Role
• Lead the forensic analysis of existing systems built on Java, Node.js, and React to document end-to-end data lineage.
• Reverse engineer complex data models and system dependencies to support critical decommissioning and migration decisions.
• Use Python, PySpark, and AWS Glue to validate data pipelines and confirm how information transforms across the wider program.
• Ensure data integrity and accuracy by implementing rigorous testing frameworks using PyTest and Great Expectations.
• Collaborate with cross-functional teams to clarify ownership, integration points, and RDS storage structures.
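To give a flavour of the data-quality work described above, here is a minimal, stdlib-only sketch of the kind of validation check the role involves. The record fields and status values are hypothetical; in the actual environment, checks like this would typically run against PySpark DataFrames via PyTest and Great Expectations rather than plain Python lists.

```python
# Illustrative sketch only: a simple data-quality check of the kind this role
# would implement with PyTest/Great Expectations over pipeline outputs.
# The schema ("customer_id", "created_at", "status") is a made-up example.

REQUIRED_FIELDS = {"customer_id", "created_at", "status"}
VALID_STATUSES = {"active", "migrated", "decommissioned"}

def validate_records(records):
    """Return a list of (row_index, problem) tuples; an empty list means the batch passes."""
    problems = []
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            problems.append((i, f"missing fields: {sorted(missing)}"))
            continue  # skip further checks on structurally broken rows
        if rec["customer_id"] is None:
            problems.append((i, "null customer_id"))
        if rec["status"] not in VALID_STATUSES:
            problems.append((i, f"unexpected status: {rec['status']!r}"))
    return problems

if __name__ == "__main__":
    batch = [
        {"customer_id": 1, "created_at": "2024-01-01", "status": "active"},
        {"customer_id": None, "created_at": "2024-01-02", "status": "migrated"},
        {"customer_id": 3, "created_at": "2024-01-03", "status": "unknown"},
    ]
    for row, problem in validate_records(batch):
        print(f"row {row}: {problem}")
```

In practice, the same assertions would be expressed as Great Expectations expectations (e.g. non-null and value-set checks) so that results feed into data-quality reporting across the decommissioning program.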
What You'll Need
• An active SC Clearance is mandatory due to the secure nature of the environment.
• Strong technical background with the ability to read and understand Java, Node.js, and React codebases.
• Advanced proficiency in Python, PySpark, and working with JSON/RDS datasets.
• Proven experience in Cloud environments (AWS preferred, Databricks is a significant plus).
• Expertise in Data Quality management and automated testing (PyTest, Great Expectations).
What's On Offer
• Competitive day rate of £500 - £510 (Inside IR35).
• 100% Remote working flexibility for a better work-life balance.
• Long-term stability with a 1-year contract duration.
• The chance to work on large-scale, impactful datasets within a high-profile secure program.
Apply via Haystack today!





