Data Freelance Hub

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a freelance basis; the contract length and pay rate are unspecified. Key skills include 5+ years in data engineering, real-time data pipelines, SQL, Snowflake, and cloud platforms (AWS or Azure).
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 16, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Snowflake #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Scala #Data Engineering #Datasets #Azure #Data Pipeline #Airflow #Automation #BI (Business Intelligence) #Cloud #dbt (data build tool) #Python #Programming #Data Quality #ETL (Extract, Transform, Load) #SQL (Structured Query Language)
Role description
Masentó are urgently looking for a Data Engineer on a freelance basis to support the continued build-out of our client's modern data platform following major digital transformation initiatives across customer experience and operations. You will play a key role in designing and delivering scalable data pipelines and data models, working with cloud-based platforms to support analytics, reporting, and business decision-making across a fast-paced global retail environment.

Key Responsibilities:
• Design, build, and optimise data pipelines and ETL processes
• Develop and maintain data models within Snowflake
• Work with large, complex datasets (customer, sales, and operational data)
• Ensure data quality, performance, and reliability
• Support analytics, reporting, and BI use cases
• Collaborate with data, product, and business stakeholders
• Contribute to data platform improvements and automation

Required Skills & Experience:
• Strong data engineering experience (5+ years preferred)
• Experience with real-time or event-driven data pipelines (essential)
• Experience with tools such as Airflow, dbt, or Kafka (essential)
• Excellent SQL skills (essential)
• Hands-on experience with Snowflake
• Experience building data pipelines and ETL/ELT processes
• Strong experience with cloud platforms (AWS or Azure)
• Programming in Python (or a similar language)
• Experience with data modelling (star/snowflake schemas)
• Familiarity with performance tuning, optimisation, and automation
• Strong stakeholder communication skills

If this matches your experience, please apply today for further information.