

Harnham
Databricks Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Data Engineer with a contract length of "unknown", offering £500 - £550 per day, outside IR35. Key skills required include DBT modelling, Python or Scala proficiency, and experience with big data tools like Databricks and Airflow.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
550
-
🗓️ - Date
February 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Data Engineering #ML (Machine Learning) #Big Data #Data Architecture #dbt (data build tool) #Scala #Monitoring #Observability #Data Pipeline #ETL (Extract, Transform, Load) #Python #Data Quality #Databricks #Data Governance #Compliance #Airflow #Data Science #Automation
Role description
Databricks Data Engineer
£500 - £550 per day
Outside IR35
We're partnering with a leading online retail company that's transforming the way data and real-time intelligence shape customer experiences. Their mission is to harness cutting-edge data and streaming technologies to drive smarter decisions, improve efficiency, and create personalised journeys for millions of shoppers worldwide.
The Role
As a Senior Data Engineer, you'll play a key role in developing and optimising the backbone of the company's data platform. You'll be responsible for building and maintaining large-scale, real-time data pipelines that power analytics, machine learning, and operational systems across the business.
You'll collaborate with software engineers, data scientists, and analytics teams to ensure the platform delivers reliable, high-quality, and compliant data at scale. This is a hands-on engineering role that blends software craftsmanship with data architecture expertise.
Key responsibilities:
• Model complex data sets using DBT.
• Build and maintain scalable backend systems in Python or Scala, following clean code and testing principles.
• Develop tools and frameworks for data governance, privacy, and quality monitoring, ensuring full compliance with data protection standards.
• Create resilient data workflows and automation within Databricks and other modern big data ecosystems.
• Contribute to an engineering culture that values testing, peer reviews, and automation-first principles.
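To illustrate the kind of automated data-quality tooling the role describes, here is a minimal sketch in Python. The function names, columns, and the 99% non-null threshold are illustrative assumptions, not details from the role description; a production version would typically run inside Databricks or be scheduled via Airflow.

```python
# Minimal sketch of an automated data-quality check (illustrative only).
# Column names and the non-null threshold are assumptions for this example.

def non_null_rate(rows, column):
    """Return the fraction of rows where `column` is non-null."""
    if not rows:
        return 1.0
    non_null = sum(1 for row in rows if row.get(column) is not None)
    return non_null / len(rows)

def run_quality_checks(rows, required_columns, threshold=0.99):
    """Flag each required column whose non-null rate falls below threshold."""
    failures = {}
    for col in required_columns:
        rate = non_null_rate(rows, col)
        if rate < threshold:
            failures[col] = rate
    return failures

orders = [
    {"order_id": 1, "customer_id": "a1", "total": 20.0},
    {"order_id": 2, "customer_id": None, "total": 35.5},
]

# customer_id is only 50% non-null in this toy data, so it gets flagged.
failures = run_quality_checks(orders, ["order_id", "customer_id"])
```

Checks like this are usually wired into pipeline orchestration so a failing column blocks downstream loads rather than silently propagating bad data.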
What You'll Bring
• Strong experience in DBT modelling
• Advanced proficiency in Python or Scala, with a solid grasp of software engineering fundamentals.
• Proven ability to design, deploy, and scale production-grade data platforms and backend systems.
• Familiarity with data governance frameworks, privacy compliance, and automated data quality checks.
• Hands-on experience with big data tools (Airflow, Databricks) and data observability platforms.
• Collaborative mindset and experience working with cross-functional teams including ML and analytics specialists.
• Curiosity and enthusiasm for continuous learning - you stay up to date with the latest tools and trends in data engineering and love sharing knowledge with others.
Please apply by email.






