

Searchability NS&D
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with active Enhanced DV Clearance, offering £680 p/d inside IR35 for a 12-month, full-time on-site position in Central London. Key skills include ETL development, Python, Apache Spark, and cloud services.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
680
-
🗓️ - Date
February 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#PostgreSQL #Kafka (Apache Kafka) #Distributed Computing #Apache Spark #Data Pipeline #Batch #ETL (Extract, Transform, Load) #MongoDB #Palantir Foundry #NoSQL #AI (Artificial Intelligence) #Stories #NiFi (Apache NiFi) #Databases #MySQL #AWS (Amazon Web Services) #Data Ingestion #Big Data #Datasets #Data Engineering #Security #Database Schema #Python #Storage #Cloud #Data Security
Role description
New Contract Opportunity for an eDV Cleared Data Engineer with a leading National Security Consultancy in London.
• Active Enhanced DV Clearance required
• £680 p/d inside IR35
• 12-month engagement
• Full-time on-site in Central London
The Role:
You will design and deliver mission-critical data services for National Security clients. Working across AI, cyber security, cloud, big data and digital transformation initiatives, you will help shape the future of national security.
You will build and manage robust data pipelines that convert raw, diverse data sources (batch, streaming, real-time and unstructured) into reliable, analysis-ready datasets, using distributed computing techniques to process data at scale.
Responsibilities:
• Design and build data ingestion pipelines and orchestration tooling.
• Develop database schemas and data models.
• Integrate and enrich data from multiple sources, ensuring quality and consistency.
• Design and implement ETL processes (e.g. using NiFi).
• Write clean, secure, test-driven code that is reusable by default.
• Maintain and enhance data ingestion and storage architectures.
• Investigate and resolve data issues within operational environments.
• Translate user needs into clear technical requirements, supporting backlog refinement into epics and stories.
• Implement appropriate data security controls.
• Monitor, maintain and optimise data platforms to ensure performance and reliability.
Skills and Experience Required:
• ETL development using Python or similar languages
• Apache Spark, NiFi, or Kafka
• Relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases
• AWS and cloud-based data services
• Palantir Foundry
• Stakeholder engagement
Why Apply?
• Work on mission-critical UK Government projects with real-world impact
• Join a leading consultancy known for innovation and technical excellence
• Long-term engagement with competitive day rate and secure on-site environment