Amber Labs

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (SC Cleared) on a 12-month contract, remote within the UK with twice-monthly travel to London. Key skills include Microsoft Fabric, PySpark, and CI/CD. Active SC clearance is required, with public sector experience preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 6, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Yes
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Data Transformations #Automation #Terraform #Microsoft Power BI #Synapse #BI (Business Intelligence) #Monitoring #AI (Artificial Intelligence) #DevOps #Dimensional Modelling #Automated Testing #PySpark #Agile #Spark (Apache Spark) #dbt (data build tool) #Scala #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #Batch #Git #GitLab #Azure #Data Pipeline #Data Quality #Data Processing #Jira #Compliance #Spark SQL #Code Reviews #ML (Machine Learning) #Delta Lake #Security #Version Control #Data Engineering #Python #SQL (Structured Query Language)
Role description
Data Engineer (SC Cleared) – London (Bi-Monthly Travel)
Location: Remote (UK) with travel to London twice a month
Contract: 12-month Fixed Term Contract
Security Clearance: Active SC (required)

About Amber Labs
Amber Labs is a specialist consultancy delivering high-quality digital services across the public and private sectors. We work closely with UK central government departments to design, build, and improve user-centred digital products that meet the highest standards of accessibility, quality, and security.

The Role
Amber Labs is seeking an SC-cleared Senior Data Engineer to design, build, and operate mission-critical data solutions that power analytics across complex public-sector environments. You will lead the development of scalable data pipelines on Microsoft Fabric, modernise legacy data estates, and mentor junior engineers to deliver high-quality, secure, and reliable data products.

What You’ll Do
• Engineer production-grade data pipelines on Microsoft Fabric (OneLake/Delta Lake, Data Factory, Synapse Data Engineering notebooks) using PySpark, Spark SQL, Python, and SQL.
• Deliver high-performance, resilient, and observable ETL/ELT pipelines with strong testing coverage and monitoring.
• Support reporting and MI use cases, including data transformations and data models that feed downstream tools such as Power BI.
• Own CI/CD and version control practices (Git/GitLab), perform code reviews, and enforce engineering standards across squads.
• Coach and mentor engineers, provide technical guidance, and contribute to architectural decisions across multiple delivery teams.
• Work in Agile delivery, collaborating with product, data, and platform teams using Jira/Confluence.
• Translate stakeholder requirements into robust engineering tasks and deliverables.
• Embed security and compliance by design, operating within BPSS/SC constraints and adhering to departmental data-handling policies.
Essential Skills & Experience
• Hands-on expertise in Microsoft Fabric: OneLake/Delta Lake, Data Factory, Synapse Data Engineering notebooks, with PySpark, Spark SQL, and Python.
• Strong experience in large-scale batch data engineering within government or similarly regulated domains.
• Proven ability to deliver performance tuning, data quality management, and resilient data processing.
• CI/CD and DevOps experience: pipeline automation, Infrastructure as Code (Terraform), automated testing, and release governance.
• Strong version control and collaboration skills using Git/GitLab, with structured branching and PR workflows.
• Experience building and consuming APIs and data services for safe, reliable data movement and exposure.
• Comfortable working in Agile environments using Jira and Confluence.
• Active SC clearance for UK government work.

Desirable Skills
• Data warehousing and modelling experience (e.g., dimensional modelling, dbt).
• Familiarity with Power BI to support end-to-end data validation and BI delivery.
• Experience with machine learning/AI data pipelines or Azure AI services.

Certifications (Nice to Have)
• Microsoft Fabric Associate Data Engineer (or higher)
• Azure AI Fundamentals

Why Amber Labs?
Amber Labs is committed to delivering high-impact data and technology solutions across public-sector programmes. You’ll work in a collaborative environment with opportunities to shape architecture, improve delivery standards, and directly impact critical public services.

Why Join Us?
• Join a fast-growing consultancy delivering high-impact public-sector technology
• Develop skills across modern front-end engineering, accessibility, and large-scale digital platforms
• Private Medical Insurance (Aviva)
• Company Pension Plan (Nest)
• 25 days annual leave + UK bank holidays
• Perkbox – global employee rewards and wellbeing platform
• Generous employee referral scheme
• Supportive, inclusive environment encouraging innovation and growth
• Choose your preferred working device – Mac or PC

Diversity & Inclusion at Amber Labs
At Amber Labs, diversity fuels innovation. We are committed to fostering a workplace where everyone feels valued, supported, and respected. We:
• Welcome a variety of backgrounds, experiences, and perspectives
• Promote equality and inclusion across all teams
• Maintain a workplace free from discrimination, bullying, and harassment

Important Information
• We cannot accept applications from candidates requiring visa sponsorship
• BPSS eligibility is required before starting
• Applicants must be based in the UK and able to travel occasionally to Manchester

What Happens Next?
After reviewing your application, our Talent Acquisition team will be in touch to outline the next steps. The process typically includes two interview stages, with a possible third involving our company Partners if required.