Robert Half

Data Engineer - DBT Azure Snowflake

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer specializing in Snowflake, dbt, and Azure DevOps, offering a 6-month contract at £400-450 per day. Key requirements include financial services experience, strong SQL skills, and expertise in data pipeline development.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
400-450
-
🗓️ - Date
March 21, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#SQL (Structured Query Language) #YAML (YAML Ain't Markup Language) #ADF (Azure Data Factory) #Data Quality #Scala #Snowflake #Azure DevOps #Azure #Data Pipeline #Data Warehouse #Macros #Data Engineering #Python #Automation #Documentation #Consulting #GIT #Scripting #dbt (data build tool) #Databricks #Azure Data Factory #Cloud #DevOps
Role description
Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an interim Snowflake Data Engineer. The role will help design and build a new cloud data warehouse solution using Snowflake, dbt, and Azure DevOps, delivering scalable ELT pipelines and optimised data models to support business-wide analytics and reporting.

Assignment Details
• Initial 6-month contract (extensions expected)
• Hybrid – 2 days per week on-site in the City of London
• £400-450 per day plus 12.07% holiday pay via PAYE (employer's NI and tax deducted at source; no umbrella company admin fees)
• Start date: c. 1-2 week turnaround, with an anticipated start w/c 6th April

Key Skills & Experience
• Financial services experience is essential for this opportunity, ideally within insurance.
• Proven experience building and managing Snowflake cloud data warehouses (warehouses, roles, RBAC, performance tuning, cost optimisation).
• Strong hands-on experience with dbt (modelling, incremental loads, macros, testing, documentation).
• Solid understanding of Azure DevOps for CI/CD pipelines (YAML, environments, approvals, Git branching).
• Excellent SQL engineering and data modelling (star/snowflake schemas, best practices).
• Experience in data pipeline development, data quality, and automation.
• Clear communicator with strong stakeholder engagement across technical and business teams.

Nice to have:
• Exposure to Azure Data Factory, Databricks, or other orchestration tools.
• Python scripting for ELT automation and testing.

All candidates must complete standard screening (Right to Work, DBS, credit/sanctions checks, and employment verification).