

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract, paying up to £625/day, remote (UK-based). Key skills include dbt Cloud, Snowflake, SQL, and cloud platforms (AWS, GCP, Azure). DataVault 2.0 experience is a plus.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
625
🗓️ - Date discovered
July 24, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Remote
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Documentation #GCP (Google Cloud Platform) #Data Engineering #GIT #Vault #Cloud #Data Architecture #Fivetran #AWS (Amazon Web Services) #Version Control #Snowflake #Scala #Azure #Tableau #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #dbt (data build tool) #Data Pipeline #Datasets
Role description
Senior Data Engineer – Contract
📍 Location: Remote (UK-based)
💰 Rate: Up to £625/day
📆 Contract: 6 months
🧾 IR35 Status: Inside IR35
Overview
We’re looking for a Senior Data Engineer to join a growing data team and help build scalable data products using Snowflake, dbt Cloud, and the DataVault 2.0 methodology. This is a hands-on role: you’ll design and build robust pipelines, apply modern modelling techniques, and help shape the long-term data architecture.
Key Responsibilities
• Build and maintain end-to-end data pipelines using Snowflake and dbt Cloud
• Apply DataVault 2.0 patterns to create flexible, traceable data models (see the hub-model sketch after this list)
• Work closely with analysts, architects, and engineers to deliver clean, reliable datasets
• Set up and manage CI/CD workflows for dbt Cloud using Git
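To make the DataVault 2.0 responsibility above concrete, here is a minimal sketch of a dbt hub model of the kind this role would build. It is illustrative only: the model and column names (stg_customers, customer_id, hub_customer) are hypothetical, and the MD5 hash key and incremental filter reflect one common DataVault convention rather than this team's actual standards.

```sql
-- models/raw_vault/hub_customer.sql
-- Hypothetical dbt model: a DataVault 2.0 hub holding one row per unique
-- business key, with a hash key, load timestamp, and record source.

{{ config(materialized='incremental', unique_key='customer_hk') }}

with source as (

    select distinct
        customer_id,                      -- business key (assumed column name)
        'crm' as record_source,           -- assumed source system label
        current_timestamp() as load_ts
    from {{ ref('stg_customers') }}       -- assumed staging model

)

select
    md5(cast(customer_id as varchar)) as customer_hk,  -- hash key over the business key
    customer_id,
    load_ts,
    record_source
from source

{% if is_incremental() %}
  -- on incremental runs, only insert business keys not already in the hub
  where md5(cast(customer_id as varchar)) not in (select customer_hk from {{ this }})
{% endif %}
```

Deterministic hash keys like this let hubs, links, and satellites join without sequence lookups, which is a large part of what makes DataVault models traceable across sources.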
Requirements
Must-have
Strong hands-on knowledge of:
• dbt Cloud (development, testing, documentation, CI/CD)
• Snowflake (SQL, tuning, access control, cost management; see the sketch after this list)
• Cloud platforms (AWS, GCP, or Azure)
• SQL for analytical modelling and transformation
• Git and version control best practices
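As an illustration of the Snowflake cost-management and access-control skills listed above, the snippet below caps a warehouse's monthly credit spend with a resource monitor and grants usage to a single role. This is a sketch: all object names and quotas (transform_wh, analyst_role, the 100-credit cap) are hypothetical.

```sql
-- Hypothetical Snowflake admin snippet: cap monthly spend on a warehouse
-- and scope access via role-based grants.

-- Cap the warehouse at 100 credits per month; warn at 80%, suspend at 100%.
create or replace resource monitor transform_wh_monitor
  with credit_quota = 100
       frequency = monthly
       start_timestamp = immediately
  triggers on 80 percent do notify
           on 100 percent do suspend;

-- Attach the monitor and keep the warehouse small with aggressive auto-suspend.
create or replace warehouse transform_wh
  with warehouse_size = 'XSMALL'
       auto_suspend = 60          -- seconds of inactivity before suspending
       auto_resume = true
       resource_monitor = transform_wh_monitor;

-- Role-based access control: analysts may use the warehouse but not alter it.
grant usage on warehouse transform_wh to role analyst_role;
```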
Nice-to-have
• Experience with DataVault 2.0 (hubs, satellites, links; a link-table sketch follows this list)
• Exposure to tools like Fivetran, Tableau, dbt Metrics Layer
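For context on the DataVault 2.0 structures named above: where hubs hold business keys, links record relationships between hubs. The sketch below is hypothetical (stg_orders and the customer/order keys are assumed names) and follows the same hashing convention as the hub example earlier.

```sql
-- models/raw_vault/link_customer_order.sql
-- Hypothetical dbt model: a DataVault 2.0 link recording the relationship
-- between the customer and order hubs (all names are illustrative).

{{ config(materialized='incremental', unique_key='customer_order_hk') }}

select distinct
    md5(cast(customer_id as varchar) || '|' || cast(order_id as varchar))
        as customer_order_hk,                          -- link hash key
    md5(cast(customer_id as varchar)) as customer_hk,  -- refers to hub_customer
    md5(cast(order_id as varchar))    as order_hk,     -- refers to hub_order
    current_timestamp() as load_ts,
    'crm' as record_source
from {{ ref('stg_orders') }}                           -- assumed staging model

{% if is_incremental() %}
  -- only insert relationships the link has not seen before
  where md5(cast(customer_id as varchar) || '|' || cast(order_id as varchar))
        not in (select customer_order_hk from {{ this }})
{% endif %}
```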