rmg digital

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Contract) in London, office-based for 5 days a week, paying £600-800 daily. Key skills include Snowflake, Python, SQL, and experience with large datasets. Familiarity with lakehouse architecture and orchestration tools is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
800
🗓️ - Date
April 10, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Visualisation #Snowflake #Airflow #Cloud #Docker #Tableau #Data Pipeline #ETL (Extract, Transform, Load) #Data Engineering #dbt (data build tool) #Scala #Version Control #Microsoft Power BI #BI (Business Intelligence) #ML (Machine Learning) #Git #Datasets #Data Science #SQL (Structured Query Language) #Python
Role description
Contract Senior Data Engineer - Snowflake
London | Office-Based (5 Days per Week) | Inside IR35 | £600-800 daily

A highly regarded, forward-thinking European financial institution is seeking a Senior Data Engineer to join its Innovation & Technology team in London. This is an exciting opportunity to play a pivotal role in shaping a cutting-edge analytics platform used by sophisticated electronic trading clients.

The Opportunity
The successful candidate will join a dynamic Data Science team responsible for building the engine behind a proprietary Trade Intelligence Analytics platform. This platform delivers advanced insights into trade performance, empowering clients with end-to-end trade cost analysis via dashboards, data feeds, and machine learning-driven products.

Working at the intersection of data engineering, analytics, and front-office trading, the role offers close collaboration with Data Scientists, Analytics Engineers, and an Electronic Trading Desk. It is a hands-on position with real impact on client-facing products.
Key Responsibilities
• Designing, developing, and scaling robust data pipelines using Python, SQL, and ideally Rust
• Translating complex business requirements into efficient trade performance benchmarks
• Building and optimising scalable data models within a modern lakehouse architecture
• Developing APIs to enable client interaction with analytics products
• Normalising and integrating data feeds to support platform adoption
• Driving engineering excellence through best practices such as CI/CD, version control, and test-driven development
• Contributing to front-end analytics and user experience improvements

Candidate Profile
• Demonstrable expertise in Snowflake architecture and development, with a strong track record of designing, optimising, and maintaining scalable data solutions within a hybrid on-prem / cloud-based environment
• Strong expertise in Python and SQL, with experience building high-performance data pipelines
• Familiarity with lakehouse architectures and modern data platforms
• Proven experience handling large-scale, high-volume datasets
• Experience with orchestration tools such as Airflow or Dagster
• Advanced knowledge of dbt for data transformation and modelling
• Hands-on experience with Docker/Podman, Git, and CI/CD workflows
• Understanding of data visualisation tools such as Tableau or Power BI
• Strong data modelling and architectural design capabilities, particularly for analytics and machine learning use cases

The Environment
This organisation fosters a culture of independent thinking, entrepreneurial spirit, and collaboration. While innovation is at its core, it places strong value on in-person collaboration, recognising the importance of shared ideas, strong relationships, and team cohesion.