

Harnham
Data Engineer - Databricks Contract
Featured Role | Apply direct with Data Freelance Hub
This role is for a Contract Data Engineer with strong Databricks and Python experience, focusing on data ingestion and migration in a cloud environment. The contract lasts 4 months, offers a competitive day rate, and is based in the Midlands.
Country: United Kingdom
Currency: £ GBP
Day rate: Unknown
Date: February 10, 2026
Duration: 3 to 6 months
Location: Hybrid
Contract: Unknown
Security: Unknown
Location detailed: Birmingham
Skills detailed: #Batch #Storage #ETL (Extract, Transform, Load) #Azure #Data Migration #CRM (Customer Relationship Management) #Snowflake #Databricks #Migration #Python #Data Engineering #Data Warehouse #API (Application Programming Interface) #Cloud #Datasets #Scala #Data Governance #Data Ingestion
Role description
A leading UK organisation is undergoing a major customer and data transformation programme and is looking for an experienced Contract Data Engineer to support a large-scale platform and data migration.
This is a high-impact role within a fast-paced environment, suited to a contractor who can operate autonomously, bring structure to complex data challenges, and implement best-practice ingestion and engineering processes across a modern cloud stack.
THE ROLE
You will join a critical transformation programme focused on consolidating multiple legacy platforms into a new customer engagement ecosystem. The role will centre around building and optimising data ingestion pipelines, supporting real-time integrations, and enabling analytics-ready datasets across the business.
This is a hands-on engineering role requiring strong experience across Databricks, Python, and modern cloud data environments, as well as the ability to work across multiple third-party systems and stakeholders.
KEY RESPONSIBILITIES
• Design, build and optimise real-time and batch data ingestion pipelines.
• Implement best-practice data engineering and architecture across Databricks and Azure.
• Support migration and consolidation of data from multiple legacy systems into a new platform ecosystem.
• Cleanse, merge, and deduplicate complex customer and transactional datasets.
• Integrate data across multiple sources including CRM, loyalty platforms, and third-party systems.
• Develop real-time ingestion solutions using APIs and webhooks.
• Build and maintain analytics-ready datasets and data models.
• Support complex data modelling and transformation logic for reporting and analytics.
• Review and enhance existing Databricks architecture to ensure scalability and performance.
• Develop API-led solutions to manage data updates and deletions across systems.
• Support ingestion from cloud storage (including Azure Blob) and bespoke data processes.
• Ensure strong data governance practices across ingestion, deletion, and data consistency.
SKILLS & EXPERIENCE
Essential:
• Strong Python experience for data engineering.
• Proven track record building real-time and batch ingestion pipelines.
• Hands-on Databricks experience within production environments.
• Experience working within modern cloud data platforms (Azure preferred).
• Strong experience integrating data via APIs and webhooks.
• Background in complex data migration, consolidation, and cleansing projects.
• Ability to work independently within fast-moving, high-pressure programmes.
• Strong stakeholder engagement and problem-solving capability.
Desirable:
• Experience working with customer, CRM, or loyalty data platforms.
• Exposure to Snowflake or similar cloud data warehouses.
• Experience in large-scale customer or data transformation programmes.
• Experience implementing data engineering best practice in evolving environments.
CONTRACT DETAILS
• Day Rate: Competitive (DOE)
• Location: Midlands-based office (ideally 1 day per week; flexible for strong candidates able to commute weekly)
• Start Date: ASAP
• Duration: Initial 4 months with strong extension potential
If you are a hands-on Data Engineer with strong ingestion and Databricks experience looking to join a high-impact transformation programme, please apply to find out more.
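To give a flavour of the cleansing and deduplication work the role describes, here is a minimal, self-contained Python sketch of merging customer records from two sources and keeping one record per normalised email. The field names ("email", "source", "name") and the source ranking are illustrative assumptions, not details from the role.

```python
# Illustrative sketch only: deduplicate customer records merged from two
# hypothetical source systems, preferring the CRM copy over the legacy one.
# Field names and the "crm"/"legacy" labels are assumptions for the example.

def dedupe_customers(records):
    """Keep one record per normalised email address."""
    priority = {"crm": 0, "legacy": 1}  # assumed ranking: lower wins
    best = {}
    for rec in records:
        key = rec["email"].strip().lower()  # normalise the merge key
        current = best.get(key)
        if current is None or priority[rec["source"]] < priority[current["source"]]:
            best[key] = rec
    return list(best.values())

merged = [
    {"email": "Jane@Example.com", "source": "legacy", "name": "Jane"},
    {"email": "jane@example.com", "source": "crm", "name": "Jane Smith"},
    {"email": "bob@example.com", "source": "legacy", "name": "Bob"},
]
print(dedupe_customers(merged))  # two records remain; Jane's CRM copy wins
```

In a Databricks context this logic would more likely be expressed over Spark DataFrames (e.g. window functions ranking duplicates per key), but the key-normalisation and source-priority idea is the same.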






