

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract paying up to £625/day, fully remote (UK-based). Key requirements include dbt Cloud, Snowflake, and data platform migration experience; familiarity with Talend, Jenkins, and CI/CD processes is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
625
🗓️ - Date discovered
July 24, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Documentation #Data Engineering #GIT #Airflow #Terraform #Jenkins #Migration #Cloud #Fivetran #Version Control #Snowflake #Compliance #Data Governance #GitHub #Scala #Tableau #SQL (Structured Query Language) #dbt (data build tool) #Talend #GDPR (General Data Protection Regulation) #Data Pipeline
Role description
Senior Data Engineer – Contract
📍 Location: Remote (UK-based)
💰 Rate: Up to £625/day
📆 Contract: 6 months
🧾 IR35 Status: Inside IR35
Overview
We’re hiring a Senior Data Engineer to lead the migration of legacy pipelines (Talend, Jenkins) to a modern data stack using dbt Cloud and Snowflake. This is a hands-on role focused on scalability, performance, and best practices. You'll work closely with data engineers, analysts, and business teams.
Key Responsibilities
• Migrate data pipelines from Talend/Jenkins to dbt Cloud and Snowflake
• Translate existing business logic into modular dbt models (see the sketch after this list)
• Refactor legacy workflows for scalability and maintainability
• Implement CI/CD for dbt Cloud using Git-based workflows
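To illustrate what the migration work looks like in practice, the sketch below shows how logic from a legacy Talend job might be broken into modular dbt models: a staging model that cleans a raw source table, and a mart model that applies the business logic and references it with ref(). All table, column, and model names here are hypothetical; the pattern, not the specifics, is what matters.

    -- models/staging/stg_orders.sql
    -- Hypothetical staging model: light cleanup of a raw source table in Snowflake.
    with source as (
        select * from {{ source('raw', 'orders') }}
    )
    select
        id                       as order_id,
        customer_id,
        order_ts::timestamp_ntz  as ordered_at,
        amount_gbp
    from source

    -- models/marts/fct_daily_revenue.sql
    -- Hypothetical mart model: business logic lifted out of the legacy Talend job.
    {{ config(materialized='incremental', unique_key='order_date') }}
    select
        ordered_at::date  as order_date,
        count(*)          as order_count,
        sum(amount_gbp)   as revenue_gbp
    from {{ ref('stg_orders') }}
    {% if is_incremental() %}
    where ordered_at >= (select max(order_date) from {{ this }})
    {% endif %}
    group by 1

Tests and documentation for models like these would live in an accompanying schema.yml, and the Git-based CI/CD workflow mentioned above would typically run dbt build against a disposable Snowflake schema on each pull request.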
Requirements
Must-have
• Proven experience in data platform migrations
• Strong hands-on experience with:
   • dbt Cloud (modular builds, CI/CD, testing, documentation)
   • Snowflake (SQL, tuning, access control, cost optimisation)
• Understanding of Talend and Jenkins, including common migration pitfalls
• Comfortable with Git and version control workflows
• Familiarity with orchestration tools (e.g., Airflow)
Nice-to-have
• Exposure to Fivetran, Tableau, or dbt Metrics Layer
• Experience with CI/CD tooling (e.g., GitHub Actions, Terraform)
• Knowledge of data governance and compliance frameworks (e.g., GDPR, SOC 2)