

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a contract of more than 6 months, at a pay rate of "£X per hour." Work is remote. The role requires 5+ years of data engineering experience; proficiency in Azure services, SQL, and Python; and relevant certifications.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
284.53
🗓️ - Date discovered
July 25, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Unknown
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Engineering #Storage #Automation #Forecasting #Data Pipeline #Cloud #Data Quality #Microsoft Power BI #Dimensional Modelling #CRM (Customer Relationship Management) #BI (Business Intelligence) #Data Wrangling #Computer Science #Metadata #Data Architecture #Monitoring #Compliance #Logging #dbt (data build tool) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Airflow #Azure Data Factory #Batch #Snowflake #DevOps #Schema Design #Data Science #Databricks #Synapse #ML (Machine Learning) #AI (Artificial Intelligence) #GDPR (General Data Protection Regulation) #Azure #ADF (Azure Data Factory) #Classification #Security #Python #Datasets #Data Integration #Observability
Role description
The Role
This is an exciting time to join the V&A Technology Team. As the V&A prepares for the opening of two new museums and continues to evolve the Young V&A experience, data is becoming essential to both operational efficiency and digital innovation. We're investing in a multi-year Technology Transformation Programme, with data as a critical enabler across the organisation.
As one of the first dedicated data engineering roles at the V&A, you will play a foundational role in creating high-quality data pipelines, transforming data from CRM, HR, Finance, and Commercial systems into clean, trusted, and AI-ready datasets that support decision-making and future digital services.
What will you be doing?
1. Data Integration & Pipeline Development
• Design and implement robust data pipelines for ingesting data from enterprise systems such as CRM (e.g., Dynamics), HR, and Finance.
• Develop automated ETL/ELT processes using modern tools (e.g., Azure Data Factory, dbt, Synapse).
• Implement data validation, quality, and cleansing routines to ensure accuracy and consistency.
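By way of illustration, a validation and cleansing routine of the kind described above might look like the short Python sketch below. The column names and quality rules are placeholders for the sketch, not fields or rules from any V&A system.
```python
# Illustrative only: basic quality rules applied before data is loaded into a
# curated layer. "contact_id", "email", and "created_at" are hypothetical fields.
import pandas as pd

REQUIRED_COLUMNS = ["contact_id", "email", "created_at"]

def validate_and_clean(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply basic validation and cleansing rules to a raw extract."""
    missing = [c for c in REQUIRED_COLUMNS if c not in raw.columns]
    if missing:
        raise ValueError(f"Missing required columns: {missing}")

    cleaned = (
        raw.drop_duplicates(subset="contact_id")       # de-duplicate on the business key
           .dropna(subset=["contact_id", "email"])     # reject rows missing key fields
           .assign(created_at=lambda df: pd.to_datetime(df["created_at"], errors="coerce"))
    )
    bad_dates = int(cleaned["created_at"].isna().sum())
    if bad_dates:
        print(f"Quarantining {bad_dates} row(s) with unparseable timestamps")
    return cleaned[cleaned["created_at"].notna()]

if __name__ == "__main__":
    sample = pd.DataFrame({
        "contact_id": [1, 1, 2, 3],
        "email": ["a@example.org", "a@example.org", None, "c@example.org"],
        "created_at": ["2025-01-10", "2025-01-10", "2025-02-01", "not-a-date"],
    })
    print(validate_and_clean(sample))
```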
2. Data Transformation & Modelling
• Collaborate with the Data Architect to define and build canonical data models across domains.
• Create enriched and denormalised datasets to support BI tools (e.g., Power BI) and AI pipelines.
• Standardise metrics and KPIs to support cross-functional reporting.
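For example, a denormalised, BI-ready dataset with a standardised KPI might be assembled along these lines; the table, column, and KPI names are illustrative assumptions, not the V&A's canonical model.
```python
# Illustrative only: joining a fact table to a dimension table to produce a
# denormalised dataset with a shared KPI definition for BI tools.
import pandas as pd

fact_sales = pd.DataFrame({
    "date_key": [20250101, 20250102],
    "product_key": [1, 2],
    "net_revenue": [120.0, 80.0],
    "orders": [4, 2],
})
dim_product = pd.DataFrame({"product_key": [1, 2], "category": ["Membership", "Retail"]})

# Denormalise for reporting: bring the descriptive attributes onto the fact rows.
enriched = fact_sales.merge(dim_product, on="product_key", how="left")

# Standardised KPI definition shared by all downstream reports.
enriched["avg_order_value"] = enriched["net_revenue"] / enriched["orders"]

print(enriched[["date_key", "category", "net_revenue", "avg_order_value"]])
```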
3. AI & Analytics Enablement
• Prepare clean, labelled datasets to support future AI/ML use cases (e.g., forecasting, classification, segmentation).
• Enable self-service analytics by producing trusted datasets in the enterprise data platform.
• Collaborate with analysts and data scientists to prototype and productionise models.
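As a simple illustration of preparing a labelled dataset for a future classification use case, the sketch below derives a "lapsed" label from a last-visit date; the 12-month rule, as-of date, and field names are assumptions, not an agreed business definition.
```python
# Illustrative only: deriving a binary label for a hypothetical
# "lapsed vs. active" classification use case.
import pandas as pd

contacts = pd.DataFrame({
    "contact_id": [1, 2, 3],
    "last_visit": pd.to_datetime(["2025-06-01", "2023-03-15", "2024-11-20"]),
})

# Fixed as-of date keeps the labelling reproducible between runs.
cutoff = pd.Timestamp("2025-07-01") - pd.DateOffset(months=12)
contacts["label_lapsed"] = (contacts["last_visit"] < cutoff).astype(int)

print(contacts)
```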
4. Security & Compliance
• Ensure secure handling of personally identifiable and financial data, in compliance with GDPR.
• Implement access controls, encryption, and retention policies within data pipelines.
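One common pattern for GDPR-aware pipelines is to pseudonymise personally identifiable fields before they reach analytical datasets. The sketch below is a minimal example of that idea, assuming the salt would be injected from a secret store (for example, Azure Key Vault) rather than hard-coded.
```python
# Illustrative only: replacing a PII value with a stable, non-reversible token.
import hashlib
import os

# Assumption: the salt is provided by the deployment environment, never committed to code.
SALT = os.environ.get("PII_HASH_SALT", "local-dev-only-salt")

def pseudonymise(value: str) -> str:
    """Return a stable, non-reversible token for a PII value."""
    return hashlib.sha256(f"{SALT}:{value}".encode("utf-8")).hexdigest()

record = {"contact_id": 42, "email": "visitor@example.org"}
safe_record = {**record, "email": pseudonymise(record["email"])}
print(safe_record)
```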
5. Data Infrastructure & Automation
• Ensure pipelines are modular, version-controlled, and integrated into CI/CD workflows.
• Optimise performance of data pipelines and storage across batch and incremental jobs.
• Implement metadata tagging, lineage tracking, and logging for observability.
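As an example of the observability practices listed above, a lightweight wrapper can record run metadata (step name, duration, row counts) for each pipeline stage; in practice this would feed a catalogue or monitoring tool rather than the standard logger. The step and column names below are illustrative only.
```python
# Illustrative only: logging run metadata for each DataFrame-in / DataFrame-out step.
import functools
import logging
import time

import pandas as pd

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger("pipeline")

def observed(step_name: str):
    """Decorate a transformation so each run emits basic observability metadata."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(df: pd.DataFrame, *args, **kwargs) -> pd.DataFrame:
            start = time.perf_counter()
            result = func(df, *args, **kwargs)
            logger.info(
                "step=%s rows_in=%d rows_out=%d duration_s=%.3f",
                step_name, len(df), len(result), time.perf_counter() - start,
            )
            return result
        return wrapper
    return decorator

@observed("drop_test_rows")
def drop_test_rows(df: pd.DataFrame) -> pd.DataFrame:
    # Remove synthetic test records before publishing the dataset.
    return df[~df["email"].str.endswith("@test.invalid")]

if __name__ == "__main__":
    demo = pd.DataFrame({"email": ["a@example.org", "bot@test.invalid"]})
    drop_test_rows(demo)
```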
Who are we looking for?
Job Specific Skills
• 5+ years of experience in data engineering or backend data development roles.
• Proven track record in building data pipelines that integrate enterprise applications (CRM, ERP, HRIS).
• Experience with cloud-native data services (Azure preferred: Data Factory, Synapse, Databricks).
• Proficiency in Microsoft Fabric, Databricks, and Snowflake.
• Proficiency in SQL, Python, and data pipeline frameworks (e.g., dbt, Airflow, Prefect).
Core Skills
• Strong data wrangling and transformation skills, with a focus on data quality and performance.
• Understanding of dimensional modelling and star/snowflake schema design for BI tools.
• Ability to collaborate effectively with architects, analysts, and business stakeholders.
• Familiarity with DevOps practices (CI/CD, testing, monitoring) in a data context.
Qualification
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
• Certifications preferred:
• Microsoft Certified: Azure Data Engineer Associate
• dbt Analytics Engineering Certification
• Databricks Certified Data Engineer
• ITIL 4 Foundation or equivalent operational framework
Behaviours
• Respects others’ expertise, time, perspectives, and contribution.
• Takes responsibility for delivering on actions, achieving high standards, and learning from mistakes.
• Open to change, new ideas, and suggestions; looks for opportunities for improvement and self-development.
• Works with others outside their own department in a collaborative, understanding, and engaging way.
What’s in it for you?
• 29 days of holiday + public holidays each year
• 5.5% employee pension contribution, 10% employer pension contribution (post-probation)
• Life assurance scheme (to value of 4 x annual salary)
• Family-friendly policies, e.g. enhanced maternity + paid carers’ leave
• Free sanitary products for all employees across our sites
• Free entrance to many major museums and exhibitions
• Benefits platform offering discounts at major retailers
& many more
Full job description and list of benefits on our website
When do applications close?
10 Aug @ 23:59