Explore Group

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Central London, offering a 3-4 day onsite contract. Requires 7+ years in software development, 5+ years with data systems, and 2+ years in Azure. Key skills include Azure Data Factory, SQL, and Snowflake.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Pipeline #Databricks #Cloud #Scala #Data Warehouse #"ETL (Extract, Transform, Load)" #Azure SQL #SQL (Structured Query Language) #BI (Business Intelligence) #Datasets #Batch #Snowflake #Data Quality #Azure Data Factory #SQL Queries #Azure #ADF (Azure Data Factory) #Requirements Gathering #Storage #Data Engineering #Python #Infrastructure as Code (IaC)
Role description
Senior Data Engineer 📍 Central London (Onsite)

We are working with a leading organisation in the European alternative asset management sector that is investing heavily in its next-generation data platform. They are looking for an experienced Senior Data Engineer to join a high-performing data team building a modern, enterprise-grade data platform that will power analytics, reporting, and data-driven decision-making across multiple business units. Both permanent and contract opportunities are available.

Working Model
• Contract: 3–4 days per week onsite in Central London
• Permanent: 5 days per week onsite in Central London

The Role
You will play a key role in designing and building scalable data infrastructure and pipelines within a cloud-first Azure environment, helping to shape the architecture of a strategic data platform. Working closely with engineering, analytics, and business stakeholders, you’ll develop robust ETL/ELT pipelines and data warehouse solutions that enable advanced analytics and business intelligence across the organisation.
Key Responsibilities
• Design, build, and maintain scalable data pipelines using Azure and Snowflake
• Develop and optimise ETL/ELT processes for batch and micro-batch workloads
• Work extensively with Azure Data Factory, Azure SQL, Azure Storage, and Azure Functions
• Design and maintain data warehouse models, including fact and dimension tables, using star and snowflake schemas
• Apply best practices from Kimball and Inmon data warehousing methodologies
• Write and optimise complex SQL queries to support analytics and reporting
• Ensure data quality, reconciliation, and consistency across multiple sources
• Collaborate with Business Intelligence teams to enable reporting and dashboards
• Participate in technical design and requirements gathering
• Contribute to data platform architecture decisions, including performance and infrastructure design
• Troubleshoot issues and continuously improve platform performance and reliability

Requirements
• 7+ years of experience in software development
• 5+ years working with data-intensive systems
• 2+ years of hands-on experience with cloud-based data platforms (Azure preferred)
• Strong experience with the Azure Data Platform, including:
   • Azure Data Factory
   • Azure SQL
   • Azure Storage
   • Azure Functions
• Advanced SQL expertise, including complex ETL development and data modelling
• Experience building periodic and micro-batch data pipelines
• Strong understanding of data warehouse architecture and loading strategies
• 1+ year of hands-on Snowflake experience
• Strong analytical and problem-solving skills with a focus on data quality
• Experience working with large-scale enterprise datasets

Nice to Have
• Advanced Snowflake optimisation and performance tuning
• Experience with Python and/or Databricks
• Experience designing end-to-end data platform architectures
• Exposure to enterprise BI platforms
• Familiarity with CI/CD pipelines and Infrastructure as Code
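To give a concrete feel for the warehouse-modelling work the role describes, here is a minimal sketch of a star schema: a fact table keyed to dimension tables, queried with an analytical join. All table and column names are hypothetical, and SQLite stands in for Azure SQL / Snowflake purely for illustration.

```python
import sqlite3

# In-memory database as a stand-in for the real warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Dimension tables hold descriptive attributes, keyed by surrogate keys.
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20260314
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE dim_fund (
    fund_key  INTEGER PRIMARY KEY,
    fund_name TEXT
);
-- The fact table stores measures and references each dimension by key.
CREATE TABLE fact_trades (
    date_key     INTEGER REFERENCES dim_date(date_key),
    fund_key     INTEGER REFERENCES dim_fund(fund_key),
    trade_amount REAL
);
""")

cur.execute("INSERT INTO dim_date VALUES (20260314, '2026-03-14', 2026)")
cur.execute("INSERT INTO dim_fund VALUES (1, 'Fund A')")
cur.executemany("INSERT INTO fact_trades VALUES (?, ?, ?)",
                [(20260314, 1, 100.0), (20260314, 1, 250.0)])

# A typical BI-style query: join the fact to its dimensions and aggregate.
cur.execute("""
SELECT d.year, f.fund_name, SUM(t.trade_amount) AS total
FROM fact_trades t
JOIN dim_date d ON t.date_key = d.date_key
JOIN dim_fund f ON t.fund_key = f.fund_key
GROUP BY d.year, f.fund_name
""")
rows = cur.fetchall()
print(rows)  # -> [(2026, 'Fund A', 350.0)]
```

In a snowflake schema, the dimensions themselves would be further normalised (e.g. `dim_fund` referencing a separate strategy table); the star form above keeps each dimension denormalised for simpler, faster analytical joins.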