Robert Half

Data Engineer – Snowflake / DBT (Financial Services)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer – Snowflake / DBT in Financial Services, offering a 6-month contract at £400–450 p/day. Requires strong Snowflake and dbt experience, SQL proficiency, and familiarity with cloud environments. Hybrid work in London.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
400-450
🗓️ - Date
April 30, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#ADF (Azure Data Factory) #Python #Data Pipeline #Consulting #DevOps #Snowflake #Azure DevOps #AWS (Amazon Web Services) #Data Engineering #Airflow #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Scala #Azure #Documentation #Automation #Data Warehouse #Cloud #dbt (data build tool)
Role description
Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an interim Snowflake Data Engineer. The role will focus on designing and delivering a modern cloud data warehouse using Snowflake and dbt, building scalable ELT pipelines and high-quality data models to support business-wide analytics and reporting.

Assignment Details
• Initial 6-month contract (extensions expected)
• Hybrid – 2 days per week on-site in the City of London
• £400–450 p/day + 12.07% holiday pay (PAYE, with employer's NI and tax deducted at source; unlike umbrella arrangements, there are no umbrella admin fees) – see the worked figures below
• Start Date: c. 1–2 weeks' turnaround

Key Skills & Experience
• Insurance/reinsurance experience strongly preferred, but data engineers with broader Financial Services experience will also be considered
• Strong hands-on experience building data solutions in Snowflake (ELT pipelines, performance optimisation, data modelling)
• Proven experience with dbt (modelling, incremental loads, testing, documentation) – see the sketches after this section
• Excellent SQL engineering and data modelling (star/snowflake schemas, best practices)
• Experience designing and delivering end-to-end data pipelines (ingestion, transformation, validation)
• Familiarity with CI/CD pipelines (Azure DevOps or similar)
• Experience working in cloud-based data environments (Azure or AWS)
• Strong communication skills and the ability to work across technical and business stakeholders

Nice to have
• Experience with orchestration tools (Airflow, Azure Data Factory or similar)
• Python for data engineering, automation or testing

All candidates must complete standard screening (Right to Work, DBS, credit/sanctions, employment verification).
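For reference on the rate: the 12.07% holiday-pay uplift takes the quoted £400–450 p/day to roughly £448–504 per worked day (400 × 1.1207 ≈ 448.28; 450 × 1.1207 ≈ 504.32).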
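To give a flavour of the dbt incremental-load work listed above, here is a minimal sketch of an incremental model. The model, source, and column names (fct_policy_transactions, raw.policy_transactions, loaded_at) are hypothetical illustrations, not taken from the client's actual stack:

    -- models/fct_policy_transactions.sql (hypothetical name)
    -- On a full refresh dbt rebuilds the table; on incremental runs only
    -- rows newer than the latest loaded_at already in the target are processed.
    {{
        config(
            materialized='incremental',
            unique_key='transaction_id'
        )
    }}

    select
        transaction_id,
        policy_id,
        transaction_amount,
        loaded_at
    from {{ source('raw', 'policy_transactions') }}

    {% if is_incremental() %}
      -- {{ this }} resolves to the existing target table in Snowflake
      where loaded_at > (select max(loaded_at) from {{ this }})
    {% endif %}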
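On the testing side, a dbt singular test is simply a SELECT saved under tests/ that should return zero rows; the test fails if any rows come back. Again a sketch with hypothetical names:

    -- tests/assert_no_negative_amounts.sql (hypothetical name)
    -- Fails the dbt test run if any negative amounts exist.
    select
        transaction_id,
        transaction_amount
    from {{ ref('fct_policy_transactions') }}
    where transaction_amount < 0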