

Snowflake Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer on a 6-month contract in London (hybrid). Key skills include Snowflake, Terraform, and ETL/ELT workflows (Dagster or Airflow). Experience in commodities trading is preferred. Pay rate is up to £750 per day.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
750
🗓️ - Date discovered
September 2, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Storage #AWS (Amazon Web Services) #Programming #Python #IAM (Identity and Access Management) #Snowflake #Data Engineering #Java #ETL (Extract, Transform, Load) #S3 (Amazon Simple Storage Service) #Data Access #Automation #Terraform #Airflow #Scala #Kafka (Apache Kafka) #AWS S3 (Amazon Simple Storage Service)
Role description
❄️ Snowflake Data Platform Engineer | Commodities Trading | London (Hybrid) | 6-Month Contract (Likely Extension) | Inside IR35
Our customer is a global leader in commodity markets, operating across multiple major trading hubs worldwide. Their technology teams deliver multi-asset-class commodity systems with a strong focus on automation, optimisation, and innovation.
Role Overview
We’re looking for a Snowflake Data Platform Engineer to play a hands-on role in building, optimising, and maintaining Snowflake-based data solutions. The role is focused on implementation, optimisation, and operations, working closely with cross-functional teams to drive data-driven insights across the business.
Key Responsibilities
• Develop and maintain Snowflake data models, schemas, and pipelines
• Build/manage ETL/ELT workflows using Dagster or Airflow
• Automate infrastructure with Terraform
• Implement secure access controls (OAuth integration)
• Optimise Snowflake queries, storage, and resource usage for cost and performance
• Ingest and integrate data from multiple systems into Snowflake
• Monitor, troubleshoot, and maintain platform reliability
• Collaborate with developers, architects, and business teams to deliver scalable data solutions
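To make the ETL/ELT responsibility concrete, here is a minimal Python sketch of one extract-transform-load step. All field, table, and feed names are illustrative, not the client's actual schema; a production pipeline would wrap these steps in Dagster assets or Airflow tasks and execute the statement via the Snowflake connector rather than just rendering SQL.

```python
# Minimal ELT sketch with illustrative names only.

def extract(raw_lines):
    """Parse CSV-like rows from a hypothetical upstream trades feed."""
    fields = ("trade_id", "commodity", "qty")
    return [dict(zip(fields, line.split(","))) for line in raw_lines]

def transform(rows):
    """Cast quantities to float and drop malformed records."""
    clean = []
    for row in rows:
        try:
            clean.append({**row, "qty": float(row["qty"])})
        except (KeyError, ValueError):
            continue  # real pipelines would quarantine bad rows instead
    return clean

def merge_sql(rows, table="TRADES"):
    """Render an idempotent Snowflake MERGE (table/columns are examples)."""
    values = ", ".join(
        "('{trade_id}', '{commodity}', {qty})".format(**r) for r in rows
    )
    return (
        f"MERGE INTO {table} t "
        f"USING (SELECT * FROM VALUES {values} AS v(trade_id, commodity, qty)) s "
        f"ON t.trade_id = s.trade_id "
        f"WHEN MATCHED THEN UPDATE SET t.qty = s.qty "
        f"WHEN NOT MATCHED THEN INSERT (trade_id, commodity, qty) "
        f"VALUES (s.trade_id, s.commodity, s.qty)"
    )

if __name__ == "__main__":
    feed = ["T1,brent,100", "T2,gold,not-a-number", "T3,wti,250.5"]
    rows = transform(extract(feed))
    print(len(rows))  # the malformed T2 row is dropped
    print(merge_sql(rows))
```

A MERGE keyed on a natural identifier keeps re-runs idempotent, which matters when an orchestrator like Dagster or Airflow retries a failed task.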
Technical Expertise
• Strong hands-on experience with Snowflake (development + admin)
• Practical knowledge of Terraform and automation workflows
• Hands-on with Dagster or Airflow for orchestration
• Working knowledge of AWS (S3, IAM)
• Familiarity with OAuth and secure data access patterns
• Strong programming skills in Python and/or Java
• Solid grounding in data engineering, ETL, and warehousing concepts
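On the Terraform side, Snowflake infrastructure is commonly declared via the community Snowflake provider. A hedged sketch of a warehouse definition is below; the resource and attribute names follow the Snowflake-Labs provider, and all values are illustrative rather than the client's actual configuration.

```hcl
# Illustrative only: a small auto-suspending warehouse declared with the
# Snowflake Terraform provider (Snowflake-Labs/snowflake).
resource "snowflake_warehouse" "analytics" {
  name           = "ANALYTICS_WH"
  warehouse_size = "XSMALL"
  auto_suspend   = 60    # seconds idle before suspending, to control spend
  auto_resume    = true
}
```

Keeping warehouse sizing and auto-suspend in code like this is one way the cost/performance optimisation responsibility above gets enforced rather than left to manual tuning.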
Bonus Experience
• Denodo
• Kafka
• Commodities or capital markets knowledge
Contract Details
• Duration: 6 months (strong likelihood of extension)
• Location: London hybrid (3 days onsite minimum)
• IR35: Inside; up to £750 per day via umbrella