

Ceox Services Ltd
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a freelance contract, remote within the UK, focusing on Azure, Databricks, and ETL pipelines. Key skills include Python, PySpark, and SQL; experience in UK Public Sector environments is desirable.
Country
United Kingdom
Currency
£ GBP
-
Day rate
Unknown
-
Date
December 10, 2025
Duration
Unknown
-
Location
Remote
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
United Kingdom
-
Skills detailed
#Data Ingestion #Documentation #Agile #ADLS (Azure Data Lake Storage) #Python #BI (Business Intelligence) #Programming #Spark (Apache Spark) #Integration Testing #Dataflow #PySpark #Physical Data Model #Data Layers #Data Engineering #Cloud #Delta Lake #Databricks #Datasets #Data Lake #SQL (Structured Query Language) #Scala #Synapse #Deployment #Microsoft Power BI #ADF (Azure Data Factory) #Azure #ETL (Extract, Transform, Load) #Security
Role description
Data Engineer (Azure | Databricks | ETL Pipelines)
Contract | Remote | UK Public Sector Project
Ceox Services are seeking a Data Engineer to support delivery of key data components within our evolving data ecosystem. This role will work closely with architects, analysts, engineers, and wider delivery teams to implement pipeline builds, data models, and integration patterns aligned to the organisation's new Target Operating Model for Data.
You will contribute to the development of scalable dataflows, optimise ingestion and transformation activities, and ensure solutions meet technical standards, performance expectations, and security controls. This is a hands-on engineering role ideal for someone who enjoys building, shaping, and improving modern data platforms.

Who We're Looking For
A highly capable Data Engineer with hands-on experience building ETL pipelines, implementing Data Lake and Delta architectures, and working across Azure platform services. You will need strong problem-solving ability, a solid engineering mindset, and the ability to work collaboratively within a multi-disciplinary team. If you're confident writing production-grade code, applying design patterns, and building scalable, high-performing data products, this role offers technical autonomy, ownership, and challenge.
Key Responsibilities
• Build and maintain physical data models, ETL pipelines and code in cloud data platforms.
• Support ingestion activity and onboarding of new data sources.
• Assist in design, development and deployment of Azure platform services (Fabric, Synapse, ADLS).
• Work with Databricks, Delta Lake, Unity Catalogue and Delta Share for dataflows and collaboration.
• Construct curated, raw and refined data layers; catalogue assets appropriately.
• Validate solutions against functional and non-functional requirements.
• Deliver datasets, transformations and performance-optimised data products.
• Improve processes, engineering patterns, and reusable tooling.
• Monitor and measure pipeline performance; support incident resolution.
• Ensure documentation meets acceptance standards and is approved centrally.
• Actively engage in Agile ceremonies and governance forums.
Mandatory Requirements
• Strong experience with Python, PySpark and SQL for data engineering.
• Hands-on experience with Azure Databricks.
• Strong knowledge of Fabric, Synapse, ADF and ADLS for ETL pipelines.
• Experience with Delta Lake, Parquet FS, Unity Catalogue and Microsoft Purview.
• Familiarity with event-driven data ingestion (Event Grid / Pub-Sub).
• Understanding of SOLID principles, async programming, and Mediator/Factory patterns.
• Experience delivering unit and integration testing in Databricks.
• Knowledge of secure ETL design with Entra ID/SCIM integration.
• Understanding of Azure best practice, APIM, and platform governance.
• Ability to build and serve Power BI models via Databricks data sources.
Desirable
• Prior experience working within UK Public Sector environments.

Soft Skills
• Strong stakeholder communication and cross-team collaboration.
• Analytical and solution-focused mindset.
• Able to work independently, take ownership and drive progress.
• Commitment to clean, scalable, well-documented engineering.
• Adaptable, proactive, and comfortable working in dynamic delivery environments.

Contract Details
• Contract Type: Freelance / Contract
• Location: Remote (UK-based candidates preferred)
• Start Date: ASAP
• Clearance: Candidates must be eligible to work with UK Government departments / BPSS.