

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract in London (Hybrid). Key skills include Snowflake, dbt, SQL, and Python. Proven experience in cloud environments and data engineering solutions is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
-
🗓️ - Date discovered
September 17, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#dbt (data build tool) #Scripting #Snowflake #Azure #BI (Business Intelligence) #Automation #Data Pipeline #Azure Data Factory #Data Engineering #ADF (Azure Data Factory) #SQL (Structured Query Language) #Cloud #Scala #Data Quality #Python #Storage #ETL (Extract, Transform, Load) #Complex Queries
Role description
Contract Senior Data Engineer
London (Fleet Street) | Hybrid | 6 months
We’re Legal 500, a global leader in research and data on the legal services industry. Our platform powers decision-making for law firms and clients worldwide, combining 60,000+ annual data submissions with insights from 300,000+ client interviews.
We’re seeking a Contract Senior Data Engineer to join our London team and deliver scalable, high-impact data solutions. This is a hands-on role where you’ll need to hit the ground running with Snowflake and dbt.
What you’ll do
• Design and implement data pipelines across ingestion, transformation, and presentation.
• Work directly with Snowflake and dbt to deliver optimised, production-ready models (a short sketch follows this list).
• Build and maintain ELT processes (SQL, Python, Azure Data Factory, APIs).
• Optimise performance within Snowflake (queries, compute, storage).
• Ensure governance, access control, and data quality across the platform.
• Collaborate closely with BI, engineering, and product teams.
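To give a feel for the dbt side of the role, here is a minimal, illustrative sketch of an incremental dbt model targeting Snowflake. The model, source, and column names (fct_client_interviews, stg_client_interviews, interview_id, interview_date) are hypothetical placeholders, not references to our actual project.

-- Hypothetical dbt model: models/marts/fct_client_interviews.sql
{{ config(
    materialized='incremental',
    unique_key='interview_id',
    cluster_by=['interview_date']
) }}

select
    interview_id,
    client_id,
    firm_id,
    interview_date,
    overall_score
from {{ ref('stg_client_interviews') }}

{% if is_incremental() %}
  -- Only transform rows newer than what the target table already holds.
  where interview_date > (select max(interview_date) from {{ this }})
{% endif %}

An incremental materialisation combined with a Snowflake clustering key is one common way to keep rebuild times and compute spend down as history grows.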
What we’re looking for
• Proven track record as a Senior Data Engineer in a modern cloud environment.
• Expert in Snowflake (design, optimisation, advanced SQL).
• Strong experience with dbt for transformations and modelling.
• Skilled in SQL and Python (automation, scripting, integration).
• Familiar with Azure Data Factory / Functions and APIs for integration.
• Comfortable joining a team where systems and processes are still evolving.
Essential Criteria
• Proven, hands-on expertise with Snowflake (architecture, modelling, optimisation); a query-tuning sketch follows this list.
• Production experience with dbt for transformations and data modelling.
• Advanced SQL skills (complex queries, performance tuning, stored procedures).
• Strong Python for automation, scripting, and integration.
• Familiarity with Azure Data Factory / Functions for orchestration and integration.
• Track record of delivering end-to-end data engineering solutions in cloud environments.
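As an illustration of the Snowflake optimisation work, a typical starting point is Snowflake's own usage views. This is a sketch only, under assumed names: the warehouse and table (TRANSFORM_WH, analytics.fct_submissions) are placeholders, not our environment.

-- Find the most expensive recent queries on a given warehouse
-- (ACCOUNT_USAGE views can lag by up to ~45 minutes).
select query_id,
       total_elapsed_time / 1000 as elapsed_seconds,
       bytes_scanned,
       partitions_scanned,
       partitions_total
from snowflake.account_usage.query_history
where warehouse_name = 'TRANSFORM_WH'
  and start_time >= dateadd('day', -7, current_timestamp())
order by total_elapsed_time desc
limit 20;

-- If pruning is poor (partitions_scanned close to partitions_total) on a large table,
-- a clustering key aligned with the dominant filter pattern may help.
alter table analytics.fct_submissions cluster by (submission_year, firm_id);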