

Lead Data Engineer | Databricks | 12-Month Contract
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Data Engineer on a 12-month contract, focusing on Databricks and Azure Data Lake. Key skills include streaming data, Delta Live Tables, and troubleshooting performance issues. Experience in cloud environments and data lakehouse architectures is essential.
- Country: United Kingdom
- Currency: £ GBP
- Day rate: -
- Date discovered: July 24, 2025
- Project duration: More than 6 months
- Location type: Unknown
- Contract type: Unknown
- Security clearance: Unknown
- Location detailed: London Area, United Kingdom
- Skills detailed: #Data Lake #ETL (Extract, Transform, Load) #Databricks #Data Pipeline #Cloud #Azure #Data Engineering #Data Lakehouse #Qlik
Role description
I'm currently working with a global energy trading company that is embarking on a large-scale digital transformation programme. At the heart of this initiative is the development of a modern data lakehouse architecture on Azure.
This is a very technical role, ideal for someone who thrives in complex cloud environments and has hands-on experience with:
• Streaming data
• Delta Live Tables (DLT)
• Azure Data Lake
You will help design and optimise streaming pipelines, troubleshoot performance issues, and shape the overall Databricks architecture.
Main Skills Needed:
• Strong experience with Databricks in a production environment
• Deep understanding of Delta Live Tables (DLT), especially for streaming use cases
• Proven background in Azure Data Lake and data lakehouse architectures
• Hands-on expertise in streaming data pipelines (Structured Streaming in continuous mode)
• Experience troubleshooting performance and latency issues with large data volumes
• Familiarity with Qlik Replicate or similar CDC tools
If this sounds like a good fit, please send over your updated CV.