Animo Group

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract in London, requiring advanced SQL, dbt expertise, and cloud experience (preferably BigQuery). Strong data modeling skills and familiarity with CI/CD practices are essential. Hybrid work setup expected.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
February 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Lifecycle #BigQuery #ETL (Extract, Transform, Load) #Data Warehouse #Consulting #Agile #GCP (Google Cloud Platform) #dbt (data build tool) #Dimensional Modelling #SQL (Structured Query Language) #Documentation #Data Science #Automated Testing #Data Transformations #Datasets #Data Engineering #Strategy #Security #Cloud #Data Security #Deployment
Role description
Data Engineer - dbt, SQL, BigQuery | 6-Month Contract | London (3 days in the office)

About the role
As a Data Engineer, you'll join our delivery teams to help build and evolve data platforms that enable analytics and downstream reporting. For this engagement, the focus is on the transform layer: designing and migrating a large set of dbt models to a new modelling approach, primarily using SQL on BigQuery (GCP). You'll work in a collaborative, low-ego environment alongside experienced engineers and client stakeholders, contributing to a delivery approach that values automated testing, CI/CD, and iterative improvement. This is an opportunity to take ownership of complex data modelling work, improve the quality and usability of warehouse data, and support data consumers such as analysts and data scientists with trusted, well-validated datasets.

What you'll be doing
• Build and maintain dbt models to transform data in BigQuery (GCP), aligning outputs to an updated data modelling strategy.
• Migrate and refactor an existing dbt estate (200+ models), reworking SQL to fit new schemas and patterns while keeping downstream consumption stable.
• Analyse source data and existing models to understand business meaning, define the target shape, and implement clear, testable transformations.
• Apply strong data modelling fundamentals (e.g., dimensional modelling) to create reliable, well-structured datasets for dashboards and analytics users.
• Follow CI/CD-friendly ways of working, including automated testing and promoting changes through environments (dev/QA and into production).
• Collaborate closely with client stakeholders and wider engineering teams, sharing knowledge and working with low ego in a multi-disciplinary delivery setup.
• Validate and demonstrate that models are correct and performant, iterating in an agile, incremental way to deliver value quickly and safely.
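To give a flavour of the transform-layer work described above, here is a minimal sketch of a dimensional dbt model with a schema test. All model, source, and column names are hypothetical illustrations, not taken from the client's estate:

```sql
-- models/marts/dim_customer.sql (hypothetical model name)
-- Staging model stg_customers is assumed to exist upstream.
with source as (

    select * from {{ ref('stg_customers') }}

),

renamed as (

    select
        customer_id,
        customer_name,
        signup_date,
        country_code
    from source

)

select * from renamed
```

In dbt's convention, correctness checks live alongside the model in a YAML schema file, so the model can be validated automatically in CI before promotion to QA or production:

```yaml
# models/marts/dim_customer.yml (hypothetical)
version: 2
models:
  - name: dim_customer
    columns:
      - name: customer_id
        tests:
          - unique
          - not_null
```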
What we're looking for
• Proven experience as a Data Engineer delivering production-grade data transformations, with strong ownership from development through release.
• Advanced SQL skills and strong data modelling fundamentals (e.g., dimensional modelling), with the ability to analyse requirements and translate them into clear, reliable models.
• Hands-on experience building and maintaining dbt projects (models, tests, documentation) and delivering well-validated transformations that downstream teams can trust.
• Familiarity with modern deployment practices, including CI/CD, automated testing, and promoting changes through non-production environments into production.
• Comfortable working with cloud data warehouses (BigQuery/GCP experience is a plus, but not essential), and able to pick up new platforms quickly where the underlying patterns are consistent.
• A collaborative, low-ego consultant mindset: able to work closely with client teams and stakeholders, share context across the team, and deliver iteratively in an agile environment.

What you'll need
• Solid experience as a Data Engineer delivering production-grade data transformations across the data lifecycle (ingest, transform, serve), with the ability to focus primarily on the transformation layer when needed.
• Strong SQL skills and practical data modelling experience (e.g., dimensional modelling), with the ability to translate business requirements into well-structured, analytics-ready datasets.
• Hands-on experience building and maintaining dbt models, including testing, documentation, and version-controlled development workflows.
• Familiarity with cloud data warehouses and platforms (ideally BigQuery on GCP), and the ability to ramp up quickly in a new cloud environment.
• Understanding of CI/CD practices and safe deployment approaches (e.g., dev/QA/production workflows), plus a disciplined approach to automated testing and quality.
• Strong collaboration and communication skills: able to work effectively with stakeholders, data scientists/analysts, and other engineers to clarify requirements and share domain knowledge.
• Comfort working in iterative, agile ways: showing progress early, incorporating feedback, and delivering in small, reliable increments.
• Awareness of data security and governance considerations, especially in regulated environments (e.g., financial services), including handling sensitive data appropriately.
• A low-ego, consulting-oriented mindset: proactive, pragmatic, and able to build trusted relationships with client teams.
• Willingness to work in a hybrid setup, with an expectation of around three days per week on-site in central London.