

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Data Product Engineer) on a 6-month contract; the pay rate is not specified. It requires strong Databricks and Spark SQL skills, experience in regulated industries, and familiarity with metadata frameworks. Hybrid work model: 2 days a week onsite.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
August 28, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Strategy #Metadata #Azure #Databricks #ML (Machine Learning) #Delta Lake #AI (Artificial Intelligence) #SQL (Structured Query Language) #Spark SQL #Cloud #Data Engineering #Vault #ETL (Extract, Transform, Load) #Spark (Apache Spark)
Role description
Data Product Engineer
Contract Length: 6 months
Engagement: Inside IR35
Days on site: 2 days a week
Interview stages: 2 stages
A long-established financial institution with a strong reputation for stability and innovation is investing heavily in its data transformation strategy. With a mission to better understand customers, manage risk, and unlock data-driven opportunities, the organisation is building a modern cloud-based data platform to support analytics, reporting, and AI initiatives.
The Role
As a Data Product Engineer, you’ll be at the heart of building the bank’s Common Data Model — a governed, reusable foundation of core entities such as Customer, Account, and Transaction. Using Databricks, Delta Lake, and Microsoft Fabric, you’ll design and deliver high-quality, well-documented data products that embed trusted business logic and serve as a single source of truth across the organisation.
What You’ll Do
• Design, build, and maintain data products and pipelines on Databricks and Fabric.
• Develop performant transformations in Spark SQL and Delta Lake.
• Collaborate with business SMEs, analysts, and data modellers to translate requirements into reliable assets.
• Apply governance and metadata standards using Unity Catalog, Purview, and Fabric.
• Optimise performance and ensure resilience of services in a regulated environment.
• Contribute to data engineering standards and mentor within the data community.
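For illustration only (not part of the original listing): the "trusted business logic" embedded in a data product such as the Customer entity often takes the shape of a Delta Lake MERGE written in Spark SQL. All table, schema, and column names below are hypothetical.

```sql
-- Hypothetical upsert into a governed Customer data product (Delta Lake table).
-- Three-part Unity Catalog naming assumed: catalog.schema.table.
MERGE INTO common_data_model.core.customer AS tgt
USING (
  SELECT customer_id,
         trim(upper(postcode))  AS postcode,       -- standardised business rule
         to_date(date_of_birth) AS date_of_birth,
         current_timestamp()    AS _loaded_at      -- lineage/audit column
  FROM   staging.raw_customer
) AS src
ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```

Embedding cleansing rules in the MERGE source query keeps the published table consistent regardless of which upstream feed supplied the row.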
What We’re Looking For
• Proven experience engineering on Databricks (Azure preferred) with Delta Lake.
• Strong Spark SQL skills, with ability to deliver maintainable queries at scale.
• Experience building modular data models and applying complex business logic.
• Understanding of data modelling techniques (canonical, vault, dimensional).
• Familiarity with metadata and governance frameworks such as Unity Catalog.
• Strong communication skills to work across technical and non-technical teams.
• Background in regulated industries (banking, insurance, healthcare) is an advantage.
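As a sketch of the Unity Catalog governance and metadata work mentioned above (object and group names are assumed), documentation and access rules are typically applied directly in SQL:

```sql
-- Hypothetical governance metadata on the Customer data product.
COMMENT ON TABLE common_data_model.core.customer IS
  'Golden Customer entity of the Common Data Model; single source of truth.';

-- Column-level documentation flags sensitive attributes for downstream users.
ALTER TABLE common_data_model.core.customer
  ALTER COLUMN date_of_birth COMMENT 'Customer date of birth (PII).';

-- Grant read access to an analyst group (group name assumed).
GRANT SELECT ON TABLE common_data_model.core.customer TO `data-analysts`;
```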
Nice to Have
• Exposure to AI/ML data workflows.
• Familiarity with Microsoft Purview.