

twentyAI
Data Engineer - TWE45313
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (TWE45313) in London, offering a 12-month contract at an undisclosed day rate. Key skills include Snowflake, Python, SQL, and dbt, with preferred experience in financial services and cloud platforms.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 23, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Scala #Scripting #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Data Quality #Jira #Documentation #Data Extraction #AWS (Amazon Web Services) #dbt (data build tool) #Data Engineering #Data Processing #Data Pipeline #Agile #Azure #Scrum #ETL (Extract, Transform, Load) #Snowflake #Automation #Python #Cloud
Role description
Data Engineer (Snowflake / Python / SQL / dbt)
Location: London (2–3 days onsite/week)
Contract: 12 months (likely extension)
Overview
Seeking a Data Engineer with strong cloud data platform experience to support the build, optimisation, and delivery of scalable data solutions within a global financial services environment. Hands-on role covering engineering, transformation pipelines, data quality, and stakeholder engagement.
Responsibilities
• Design, build, and maintain scalable data pipelines and ETL/ELT workflows
• Develop and optimise data models within Snowflake
• Build and manage transformation layers using dbt
• Write efficient SQL for data extraction, transformation, and reporting
• Develop Python-based automation and data processing solutions
• Monitor pipeline performance, data quality, and system reliability
• Collaborate with analysts, product, and engineering teams on data requirements
• Provide clear stakeholder updates on delivery progress and risks
• Support Agile delivery practices (Scrum, Jira)
Skills
• Strong Snowflake engineering and performance optimisation
• Advanced SQL development and data modelling
• Python for data engineering, automation, and scripting
• Hands-on dbt experience (models, testing, documentation)
• ETL / ELT pipeline development experience
• Experience within financial services / regulated environments preferred
• Cloud platform exposure (AWS / Azure / GCP beneficial)
• Agile experience & stakeholder management
