Robert Half

DBT Data Engineer (Snowflake)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a DBT Data Engineer (Snowflake) on an initial 6-month contract, hybrid in London, paying £400–450 p/day. Requires strong Snowflake and dbt experience, preferably in Insurance/Reinsurance, and excellent SQL skills.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
520
🗓️ - Date
May 15, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Databricks #Airflow #dbt (data build tool) #SQL (Structured Query Language) #Azure DevOps #Snowflake #Deployment #Data Engineering #Automation #Azure Data Factory #AWS (Amazon Web Services) #Cloud #Azure #ETL (Extract, Transform, Load) #Scala #DevOps #Data Pipeline #Consulting #Data Warehouse #Python #ADF (Azure Data Factory) #Documentation
Role description
Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an interim DBT Data Engineer. The role focuses on designing, delivering and supporting a modern cloud data warehouse built on Snowflake and dbt, including scalable ELT pipelines and high-quality data models to support business-wide analytics and reporting.

Assignment Details
• Initial 6-month contract (extensions expected)
• Hybrid: 2 days per week on-site in the City of London
• £400–450 p/day + 12.07% holiday pay (PAYE, with employer's NI and tax deducted at source, unlike an umbrella arrangement, and with no umbrella admin fees)
• Start date: c. 1–2 weeks' turnaround

Key Skills & Experience
• Insurance/Reinsurance experience strongly preferred; broader Financial Services experience will also be considered
• Strong hands-on experience building and supporting data solutions in Snowflake (ELT pipelines, performance optimisation, data modelling)
• Proven experience with dbt, including models, incremental loads, hooks, testing, snapshots and documentation
• Good operational understanding of dbt model failures, notifications, troubleshooting and production support workflows
• Excellent SQL engineering and data modelling skills (star/snowflake schemas, best practices)
• Experience designing and supporting end-to-end data pipelines (ingestion, transformation, validation)
• Familiarity with Snowflake features including Time Travel, cloning and RBAC
• Experience working in cloud-based data environments (Azure preferred; AWS also considered)
• Familiarity with CI/CD pipelines and deployment processes (Azure DevOps or similar)
• Strong communication skills and the ability to work with both technical and business stakeholders

Nice to have
• Experience with orchestration tools (Airflow, Azure Data Factory or similar)
• Python for automation, testing or data engineering support
• Exposure to Databricks or modern Lakehouse environments

All candidates must complete standard screening (Right to Work, DBS, credit/sanctions checks, and employment verification).
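For context on the "incremental loads" skill the listing asks for: in dbt this usually means models materialised incrementally, so that each run processes only new or changed rows rather than rebuilding the whole table. The sketch below is illustrative only, not part of the job spec; the source, table and column names (`raw`, `claims`, `claim_id`, `updated_at`) are hypothetical.

```sql
-- Minimal sketch of a dbt incremental model (hypothetical names throughout).
{{ config(
    materialized='incremental',
    unique_key='claim_id'
) }}

select
    claim_id,
    policy_id,
    claim_amount,
    updated_at
from {{ source('raw', 'claims') }}

{% if is_incremental() %}
  -- On incremental runs, only pick up rows newer than what is
  -- already in the target table ({{ this }} refers to that table).
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; on later runs the `is_incremental()` branch filters to new rows, and `unique_key` lets dbt merge updates rather than insert duplicates.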