

Tenth Revolution Group
AWS/Snowflake Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS/Snowflake Data Engineer with 5+ years of experience in SQL Server, Snowflake, AWS, and Python. It is a 6-month contract based in London (hybrid) with a daily rate of £500 (outside IR35), focusing on data integration and orchestration tools.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
500
-
🗓️ - Date
March 21, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Consulting #SQL (Structured Query Language) #Kubernetes #S3 (Amazon Simple Storage Service) #Data Pipeline #Airflow #Snowflake #AI (Artificial Intelligence) #Data Engineering #Terraform #Cloud #AWS (Amazon Web Services) #Python #SQL Server #Data Integration
Role description
I'm currently working with a client in the consulting sector who is looking for a contract AWS Data Engineer with strong Snowflake experience.
What You’ll Do:
Build and maintain data pipelines using SQL Server and Snowflake.
Support AWS infrastructure work with tools such as Terraform, Kubernetes, and S3.
Develop orchestration workflows using Dagster, Prefect, or Airflow.
Contribute to data product development, including data mesh concepts.
Assist in integrating data with AI‑focused tools such as AWS Bedrock and Snowflake Cortex AI.
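As a flavour of the orchestration work above: tools like Dagster, Prefect, and Airflow all model a pipeline as a dependency graph of tasks run in topological order. The sketch below shows that core idea in plain Python with the standard library only; the task names and pipeline shape are hypothetical, not taken from the client's actual stack.

```python
# Minimal sketch of a dependency-ordered data pipeline, illustrating the kind
# of workflow that Dagster, Prefect, or Airflow manage at scale.
# Task names and return values are illustrative placeholders.
from graphlib import TopologicalSorter


def extract():
    return "rows from SQL Server"


def transform():
    return "cleaned rows"


def load():
    return "rows loaded into Snowflake"


# Each task maps to the set of tasks it depends on (extract -> transform -> load).
PIPELINE = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

TASKS = {"extract": extract, "transform": transform, "load": load}


def run(pipeline, tasks):
    """Execute tasks in dependency order and return their results."""
    results = {}
    for name in TopologicalSorter(pipeline).static_order():
        results[name] = tasks[name]()
    return results


if __name__ == "__main__":
    print(run(PIPELINE, TASKS))
```

Real orchestrators add scheduling, retries, and observability on top of this dependency-resolution core.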
Required skillset:
5+ years' hands-on experience with SQL Server, Snowflake, AWS, and Python.
Background in delivering data products end‑to‑end.
Familiarity with modern orchestration tools.
Understanding of cloud and AI‑related data integrations.
Clear communication and collaborative working style.
Location - London (hybrid - 3 days onsite)
Contract - 6 months
Daily rate - £500 outside IR35






