

idpp
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Analytics Engineer on a 12-month contract, paying £500 per day (Inside IR35). Based in Central London with hybrid working, it requires expert SQL, ETL, and dbt skills, plus experience with large-scale datasets in financial systems.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
500
-
🗓️ - Date
February 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Pipeline #BigQuery #Data Warehouse #dbt (data build tool) #Scala #Cloud #Datasets #Data Science #Data Governance #Data Architecture #Data Engineering #Compliance #Migration #Airflow #Snowflake #ETL (Extract, Transform, Load) #SQL (Structured Query Language)
Role description
Lead Data Analytics Engineer – Contract
£500 per day (Inside IR35)
12-month contract
Hybrid working – Central London
A major UK banking client is undertaking a large-scale data warehouse re-architecture and migration programme and is looking for an experienced Lead Analytics Engineer to join on a long-term contract.
This is a hands-on role where you will embed within an existing data team aligned to a key business domain (e.g. Payments, Borrowing, Finance) and take ownership of defining and building the next generation of core data assets and pipelines.
This is not a like-for-like migration. The programme is focused on simplifying architecture, reducing cost and duplication, and enabling analytics teams quickly through well-designed, scalable data models and zero-downtime cutovers.
You will:
• Embed within a cross-functional data team in a defined business area
• Design and build scalable, high-quality data models in a new warehouse architecture
• Translate complex business requirements into robust SQL-based modelling patterns
• Lead safe historical backfills and seamless migrations for downstream users
• Work closely with Analytics Engineers, Data Scientists, backend engineers, and the data platform team
• Contribute to data standards, governance, and best practice across the organisation
• Rapidly unblock analytics teams with trusted, well-modelled datasets
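The "safe historical backfills" responsibility above typically means loading history partition by partition in an idempotent way, so a failed or repeated run never duplicates rows. A minimal sketch in plain Python (the function names and the in-memory "warehouse" dict are hypothetical stand-ins for BigQuery partitioned tables, not part of this role description):

```python
from datetime import date, timedelta

# Hypothetical in-memory "warehouse": partition date -> list of rows.
warehouse: dict[date, list[dict]] = {}

def extract_source_rows(day: date) -> list[dict]:
    """Stand-in for reading one day's data from a source system."""
    return [{"day": day.isoformat(), "amount": 100}]

def backfill_partition(day: date) -> None:
    """Idempotent load: overwrite the whole partition, never append.

    Re-running the same day produces the same result, so a failed
    backfill can be retried safely without duplicating rows.
    """
    warehouse[day] = extract_source_rows(day)

def backfill_range(start: date, end: date) -> int:
    """Backfill each day in [start, end]; returns partitions written."""
    count = 0
    day = start
    while day <= end:
        backfill_partition(day)
        count += 1
        day += timedelta(days=1)
    return count

# Running the range twice is safe: the partition overwrite keeps it idempotent.
n = backfill_range(date(2026, 1, 1), date(2026, 1, 3))
n_again = backfill_range(date(2026, 1, 1), date(2026, 1, 3))
```

In a real BigQuery/dbt setup the same idea appears as partition-overwrite (`insert_overwrite`) incremental models, which is what makes zero-downtime cutovers for downstream users tractable.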
Technical Environment
• dbt (SQL-first modelling)
• Airflow orchestration
• BigQuery cloud data warehouse
• Data sourced from financial, transactional, and operational backend systems
• Combination of open-source, cloud, and internal tooling
Required Experience
• Strong track record in data modelling, ETL, and building scalable data pipelines
• Expert-level SQL and deep knowledge of data warehousing concepts
• Experience working with very large-scale datasets (terabyte/petabyte scale)
• Hands-on experience with dbt, Airflow, and BigQuery or Snowflake
• Experience modelling data from financial, transactional, or operational systems
• Proven ability to build highly reliable datasets with strong correctness guarantees
• Experience working closely with business or product teams to design data architecture for analytics and reporting
Soft Skills
• Comfortable operating in fast-moving, cross-functional teams
• Able to mentor and elevate Analytics Engineers and Data Scientists through strong data practices
• Collaborative approach with backend engineering and platform teams
• Strong understanding of data governance, integrity, and compliance
• Experience helping define and raise data standards across teams