

Robert Half
Azure Snowflake Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Snowflake Data Engineer on an initial 6-month contract, hybrid in London, paying £450–£500 per day. It requires financial services experience, strong SQL skills, and expertise in Snowflake, dbt, and Azure DevOps.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
450-500
-
🗓️ - Date
October 25, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Azure Data Factory #Scala #Consulting #ADF (Azure Data Factory) #dbt (data build tool) #Snowflake #Cloud #Data Warehouse #YAML (YAML Ain't Markup Language) #GIT #Azure DevOps #Data Pipeline #Automation #Documentation #Data Quality #Scripting #Data Engineering #DevOps #Databricks #SQL (Structured Query Language) #Azure #Macros #Python
Role description
Snowflake Data Engineer | Robert Half London (Hybrid)
Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an interim Snowflake Data Engineer. This role will help design and build a new cloud data warehouse solution using Snowflake, dbt, and Azure DevOps, delivering scalable ELT pipelines and optimised data models to support business-wide analytics and reporting.
Assignment Details
• Initial 6-month contract (extensions expected)
• Hybrid – 2 days per week on-site in the City of London
• £450–£500 per day PAYE (plus 12.07% holiday pay; engaged directly on PAYE with employer's NI and tax deducted at source, so no umbrella company or umbrella admin fees)
Key Skills & Experience
• Financial Services experience is essential for this opportunity.
• Proven experience building and managing Snowflake Cloud Data Warehouses (warehouses, roles, RBAC, performance tuning, cost optimisation).
• Strong hands-on experience with dbt (modelling, incremental loads, macros, testing, documentation).
• Solid understanding of Azure DevOps for CI/CD pipelines (YAML, environments, approvals, Git branching).
• Excellent SQL engineering and data modelling (star/snowflake schemas, best practices).
• Experience in data pipeline development, data quality, and automation.
• Clear communicator with strong stakeholder engagement across technical and business teams.
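By way of illustration, the dbt incremental-load pattern referenced above might look like the following minimal sketch (the model, source, and column names are hypothetical, not taken from the role):

```sql
-- Hypothetical dbt incremental model; names are illustrative only.
{{ config(materialized='incremental', unique_key='trade_id') }}

select
    trade_id,
    trade_date,
    amount
from {{ source('raw', 'trades') }}

{% if is_incremental() %}
  -- On incremental runs, only pick up rows newer than what is already loaded.
  where trade_date > (select max(trade_date) from {{ this }})
{% endif %}
```

On a first run dbt builds the full table; on subsequent runs the `is_incremental()` branch restricts the scan to new rows, which is the usual way to keep warehouse compute costs down on large fact tables.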
Nice to Have
• Exposure to Azure Data Factory, Databricks, or other orchestration tools.
• Python scripting for ELT automation and testing.
• Experience in other regulated industries (beyond Financial Services).
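As a sketch of the kind of Python scripting for ELT testing mentioned above, a data-quality check might validate a batch before loading it into the warehouse (function, table, and column names here are hypothetical):

```python
# Hypothetical ELT data-quality helper: flag rows where a required
# column is missing or null before the batch is loaded downstream.

def check_no_nulls(rows, required_columns):
    """Return (row_index, column) pairs where a required column is missing or None."""
    failures = []
    for i, row in enumerate(rows):
        for col in required_columns:
            if row.get(col) is None:
                failures.append((i, col))
    return failures

# Example: validate a small batch of trade records.
batch = [
    {"trade_id": 1, "amount": 100.0},
    {"trade_id": 2, "amount": None},
]
issues = check_no_nulls(batch, ["trade_id", "amount"])
# issues == [(1, "amount")]
```

In practice checks like this would typically run as dbt tests or as a pipeline step in Azure DevOps, failing the run before bad data reaches reporting models.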
All candidates must complete standard screening (Right to Work, DBS, credit/sanctions, employment verification).
This is a great opportunity to join a high-performing data team and deliver a modern, scalable Snowflake platform leveraging dbt and Azure DevOps best practices.