

Harnham
Databricks Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Data Engineer, offering £500 - £600 per day for a contract length of "unknown." Key skills include streaming technologies (Kafka, Flink), Python or Scala proficiency, and experience with big data tools like Airflow and Databricks.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
600
-
🗓️ - Date
October 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Databricks #Data Engineering #Big Data #Data Architecture #Data Governance #ETL (Extract, Transform, Load) #Kafka (Apache Kafka) #Compliance #Automation #Monitoring #Observability #Python #Data Science #ML (Machine Learning) #Scala #Data Quality #Airflow #Data Pipeline
Role description
Databricks Data Engineer
£500 - £600 per day
Outside IR35
We're partnering with a leading online retail company that's transforming the way data and real-time intelligence shape customer experiences. Their mission is to harness cutting-edge data and streaming technologies to drive smarter decisions, improve efficiency, and create personalised journeys for millions of shoppers worldwide.
The Role
As a Senior Data Engineer, you'll play a key role in developing and optimising the backbone of the company's data platform. You'll be responsible for building and maintaining large-scale, real-time data pipelines that power analytics, machine learning, and operational systems across the business.
You'll collaborate with software engineers, data scientists, and analytics teams to ensure the platform delivers reliable, high-quality, and compliant data at scale. This is a hands-on engineering role that blends software craftsmanship with data architecture expertise.
Key responsibilities:
• Design and implement high-throughput data streaming solutions using Kafka, Flink, or Confluent.
• Build and maintain scalable backend systems in Python or Scala, following clean code and testing principles.
• Develop tools and frameworks for data governance, privacy, and quality monitoring, ensuring full compliance with data protection standards.
• Create resilient data workflows and automation within Airflow, Databricks, and other modern big data ecosystems.
• Implement and manage data observability and cataloguing tools (e.g., Monte Carlo, Atlan, DataHub) to enhance visibility and reliability.
• Partner with ML engineers, analysts, and analytics engineers to understand their data needs and enable advanced data use cases.
• Contribute to an engineering culture that values testing, peer reviews, and automation-first principles.
What You'll Bring
• Strong experience in streaming technologies such as Kafka, Flink, or Confluent.
• Advanced proficiency in Python or Scala, with a solid grasp of software engineering fundamentals.
• Proven ability to design, deploy, and scale production-grade data platforms and backend systems.
• Familiarity with data governance frameworks, privacy compliance, and automated data quality checks.
• Hands-on experience with big data tools (Airflow, Databricks) and data observability platforms.
• Collaborative mindset and experience working with cross-functional teams including ML and analytics specialists.
• Curiosity and enthusiasm for continuous learning - you stay up to date with the latest tools and trends in data engineering and love sharing knowledge with others.
Please apply by email.