Lucid Support Services Ltd

Lead Data Engineer (SC Cleared)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer (SC Cleared) on a 3-month rolling contract, remote with occasional site visits. Key skills include POLE data modelling, AWS services (Flink, Glue), Java, and PostgreSQL. Active SC Clearance is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 22, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Yes
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Security #Data Pipeline #Logging #Data Engineering #Data Processing #Java #Leadership #PostgreSQL #AWS Glue #Datasets #AWS (Amazon Web Services) #Compliance #Databases #Schema Design
Role description
Lead Data Engineer (SC Cleared) • 3-month rolling contract • Remote working (with occasional site visits to a regional hub in Sheffield/Croydon) • Active SC Clearance required

Overview
We are seeking an SC Cleared Lead Data Engineer to support a UK Central Government programme focused on building secure, auditable, and intelligence-led data platforms. You will provide technical leadership across data engineering activities, setting standards for data modelling, schema design, and auditability, while working closely with architects, analysts, and delivery teams. This role is highly hands-on and suited to a senior engineer capable of translating complex raw data into structured, trustworthy, and compliant datasets aligned to the POLE (Person, Object, Location, Event) data model.

Key Responsibilities
• Lead the design and delivery of secure, auditable data pipelines within a government environment
• Structure, normalise, and integrate raw data sources into a POLE-aligned data model
• Design and implement schemas aligned to POLE entities, ensuring consistency, extensibility, and performance
• Build and optimise temporal data models, supporting historical truth and time-based analysis
• Implement permission enforcement mechanisms using permission identifiers to control access at the data and query level
• Develop data processing workloads using AWS services, including Flink and Glue
• Write and maintain high-quality Java-based data processing code
• Design, implement, and optimise queries in PostgreSQL (or equivalent relational databases)
• Ensure data solutions meet audit, compliance, and information assurance requirements
• Embed database-level audit logging, capturing:
  • Who accessed data
  • What data was accessed
  • When access occurred
  • Why access was permitted
• Act as a technical authority, mentoring engineers and setting best practices across the team
• Collaborate with security, architecture, and governance stakeholders to ensure platform compliance

Essential Skills & Experience
• Strong experience designing and implementing temporal databases
• Proven expertise in POLE data modelling, including:
  • Structuring raw data into the POLE model
  • Designing schemas aligned to POLE entities
• Experience implementing fine-grained permission enforcement using permission identifiers
• Hands-on experience with AWS data services, particularly:
  • Apache Flink on AWS
  • AWS Glue
• Strong Java development background in data-intensive systems
• Advanced experience with PostgreSQL or equivalent relational databases
• Deep understanding of:
  • Schema design
  • Query optimisation
  • Data normalisation strategies
• Strong experience building auditable data systems, including:
  • Database-level audit logging
  • Secure access tracking and traceability
• Ability to work independently in a remote, delivery-focused environment

Desirable Experience
• Experience working within UK Central Government or public sector programmes
• Familiarity with data used in law enforcement, intelligence, or investigative contexts
• Experience designing systems for high-assurance or regulated environments
• Prior technical leadership or lead engineer experience on complex data platforms
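To make the core concepts concrete for candidates, the sketch below illustrates in Java how POLE-aligned records, permission identifiers, temporal validity windows, and who/what/when/why audit logging can fit together. This is a minimal illustration only: all names here (PoleEntity, permissionId, AuditEntry, readAllowed) are hypothetical and are not the programme's actual schema or API.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only; hypothetical names, not the programme's actual design.
public class PoleAuditSketch {

    // The four POLE entity kinds: Person, Object, Location, Event.
    enum PoleKind { PERSON, OBJECT, LOCATION, EVENT }

    // A minimal POLE-aligned record carrying a permission identifier and a
    // temporal validity window (validTo == null means "currently valid"),
    // supporting historical-truth queries.
    record PoleEntity(String id, PoleKind kind, String permissionId,
                      Instant validFrom, Instant validTo) {
        boolean activeAt(Instant t) {
            return !t.isBefore(validFrom) && (validTo == null || t.isBefore(validTo));
        }
    }

    // Audit record capturing who accessed data, what was accessed,
    // when access occurred, and why it was (or was not) permitted.
    record AuditEntry(String who, String what, Instant when, String why) {}

    // Permission enforcement at the data level: a caller may read an entity
    // only if they hold its permission identifier; every decision is audited.
    static boolean readAllowed(PoleEntity e, String user,
                               List<String> grants, List<AuditEntry> auditLog) {
        boolean allowed = grants.contains(e.permissionId());
        auditLog.add(new AuditEntry(user, e.id(), Instant.now(),
                allowed ? "holds grant " + e.permissionId() : "denied: no matching grant"));
        return allowed;
    }

    public static void main(String[] args) {
        List<AuditEntry> auditLog = new ArrayList<>();
        PoleEntity person = new PoleEntity("P-1", PoleKind.PERSON, "perm-a",
                Instant.parse("2024-01-01T00:00:00Z"), null);

        System.out.println(readAllowed(person, "analyst1", List.of("perm-a"), auditLog)); // true
        System.out.println(readAllowed(person, "analyst2", List.of("perm-b"), auditLog)); // false
        System.out.println(person.activeAt(Instant.parse("2023-06-01T00:00:00Z")));       // false
        System.out.println(auditLog.size()); // 2 (both the grant and the denial are logged)
    }
}
```

In a production system the same ideas would typically live in PostgreSQL (temporal columns or range types on POLE tables, row-level permission identifiers, and trigger- or application-level audit tables) rather than in in-memory objects; the sketch only shows the shape of the model.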