

Harnham
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer, offering £500 - £560 per day on a contract of unspecified length. Located in London with hybrid working (one day per week in the office), it requires strong skills in dbt, Airflow, Snowflake, and Python, along with experience in automated testing and CI/CD.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£500 - £560
🗓️ - Date
October 9, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
London
🧠 - Skills detailed
#Scala #Automated Testing #Forecasting #Data Engineering #Deployment #Snowflake #Web Scraping #Automation #Airflow #AI (Artificial Intelligence) #Data Architecture #Python #Cloud #Data Pipeline #Data Ingestion #ETL (Extract, Transform, Load) #dbt (data build tool) #Version Control #Data Extraction #Data Quality
Role description
Data Engineer
£500 - £560 per day
London - 1 day per week in office
We're working with a leading global healthcare technology company that is building out its next-generation data platform, with a strong emphasis on automation, testing, and cloud-native engineering. The company is looking for an experienced Data Engineer to join the team.
The Role
You'll be part of a modern data engineering function that's implementing best-in-class data practices across ingestion, transformation, and orchestration layers. The environment is highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools.
Day-to-day responsibilities include:
• Designing and developing dbt models and Airflow pipelines within a modern data stack (see the sketch after this list).
• Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs.
• Implementing automated testing and CI/CD pipelines for data workflows.
• Performing data extraction and enrichment, including web scraping and parsing of unstructured text (e.g., scanned forms and documents).
• Collaborating on forecasting and predictive analytics initiatives.
• Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function.
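For a flavour of the dbt-plus-Airflow work described above, here is a minimal sketch of an Airflow DAG that builds a dbt project and then runs its tests. It assumes Airflow 2.4+ and a dbt project at a hypothetical path; the DAG name, path, and schedule are illustrative, not details of the client's actual platform.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_build",          # hypothetical DAG name
        start_date=datetime(2025, 10, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models, then run dbt's built-in schema/data tests.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/analytics && dbt run",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/analytics && dbt test",
        )
        dbt_run >> dbt_test  # models must build before they are tested

In practice this run/test split is what lets a CI pipeline fail fast when a model or its data contract breaks.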
Tech Stack & Skills
Core skills:
• Strong experience with dbt, Airflow, Snowflake, and Python
• Proven background in automated testing, CI/CD, and test-driven development (illustrated in the sketch below)
• Experience building and maintaining data pipelines and APIs in production environments
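To illustrate the test-driven development expectation, a small pytest-style sketch of a pandas transformation with a unit test that could run in CI; the function and column names are hypothetical, not taken from this role.

    import pandas as pd

    def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
        # Drop rows with no order_id and coerce amounts to floats.
        cleaned = raw.dropna(subset=["order_id"]).copy()
        cleaned["amount"] = cleaned["amount"].astype(float)
        return cleaned

    def test_clean_orders_drops_null_ids():
        raw = pd.DataFrame({"order_id": ["A1", None], "amount": ["10.5", "3"]})
        result = clean_orders(raw)
        assert list(result["order_id"]) == ["A1"]  # null-keyed row removed
        assert result["amount"].dtype == float     # amounts now numeric

Writing the test first and then the transformation is the TDD loop the spec is asking for.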
Nice to have:
• Knowledge of Snowflake infrastructure and data architecture design
• Experience using LLMs or MLOps frameworks for data extraction or model training (a simple extraction baseline is sketched below)
• Familiarity with cloud-agnostic deployments and version control best practices
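On the data extraction side, a common baseline before reaching for LLMs is plain regular-expression parsing of OCR'd form text (compare the unstructured-text responsibility above). A minimal sketch; the field names and patterns are invented for illustration.

    import re

    FORM_TEXT = """
    Patient ID: 48213
    Visit date: 2025-09-30
    """

    def extract_fields(text: str) -> dict:
        # Pull simple key fields out of free text; None marks a miss.
        patient = re.search(r"Patient ID:\s*(\d+)", text)
        visit = re.search(r"Visit date:\s*(\d{4}-\d{2}-\d{2})", text)
        return {
            "patient_id": patient.group(1) if patient else None,
            "visit_date": visit.group(1) if visit else None,
        }

    print(extract_fields(FORM_TEXT))
    # {'patient_id': '48213', 'visit_date': '2025-09-30'}

An LLM-based extractor would slot in behind the same dict-returning interface, keeping downstream pipelines unchanged.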
What You'll Bring
• A proactive, hands-on approach to engineering challenges
• A passion for data quality, scalability, and performance
• The ability to influence best practices and introduce modern standards across a data estate
• Strong problem-solving skills and the confidence to work across multiple complex data sources
Why Join?
This is an opportunity to help shape the data foundations of a high-impact healthcare technology business - one that's actively exploring the intersection of data engineering, MLOps, and AI.
You'll have ownership of end-to-end data workflows, work with a world-class tech stack, and join a forward-thinking team that values automation, collaboration, and innovation.