

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer focused on Snowflake platform build, based in London (2 days onsite). Contract length is 6 months, with potential for 18–24 months. Key skills include Snowflake implementation, Python, PySpark, and AWS experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
-
🗓️ - Date discovered
June 4, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Storage #Python #Cloud #PySpark #Compliance #dbt (data build tool) #Data Lake #Spark (Apache Spark) #Databricks #Data Engineering #DevOps #Snowflake
Role description
Senior Data Engineer – Snowflake Platform Build
Location: London (2 days a week on site)
Contract: 6-month initial term (18–24 month project)
Interview process: 2 stages
twentyAI's customer is building the next version of their data platform, with Snowflake at the core. The PoC is in place; now they need someone who has been through this before to lead the implementation and help scale the platform across the business.
The project involves migrating the current on-prem data lake to Snowflake while keeping storage on a private cloud. You'll be laying the foundations: building out ETL pipelines, establishing best practices, and working closely with teams across trading, finance, compliance, and ops.
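To give a flavour of the kind of ETL work described above, here is a minimal Python sketch of one transform step: normalising raw trade records before loading them into a Snowflake table. All field names and values are hypothetical, and a real pipeline here would use PySpark and the Snowflake Spark connector rather than plain Python.

```python
# Illustrative transform step for a trade-data ETL pipeline.
# Field names (symbol, notional, epoch_s) are hypothetical examples,
# not taken from the actual platform described in this role.
from datetime import datetime, timezone

def transform_trades(raw_rows):
    """Normalise raw trade records: strip and upper-case symbols,
    cast notionals to float, and convert epoch seconds to
    ISO-8601 UTC timestamps ready for loading."""
    cleaned = []
    for row in raw_rows:
        cleaned.append({
            "symbol": row["symbol"].strip().upper(),
            "notional": float(row["notional"]),
            "traded_at": datetime.fromtimestamp(
                row["epoch_s"], tz=timezone.utc
            ).isoformat(),
        })
    return cleaned

rows = transform_trades([
    {"symbol": " brn ", "notional": "1250.50", "epoch_s": 1717500000},
])
print(rows[0]["symbol"])  # BRN
```

In a Spark-based build the same logic would typically live in a DataFrame transformation, with the cleaned frame written out via the Snowflake connector.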
Profile:
• Strong experience implementing Snowflake in a lead or senior capacity
• Solid background in Python, PySpark, and Spark
• Hands-on with platform setup – ideally with a DevOps-first approach
• Exposure to AWS environments
• Experience working with data from trading platforms or within commodities, banking, or financial services
Tech environment:
• Primary Platform: Snowflake
• Other Tech: dbt, Databricks, Spark, PySpark, Python
• Cloud: AWS (preferred), Private Cloud storage
• Data Sources: Financial/trading systems