

Atrium (EMEA)
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on an initial 3-month contract, primarily remote with hybrid on-site attendance in London. It requires 5+ years of experience in enterprise data environments, strong GCP skills, and banking sector experience. A competitive day rate is offered.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 17, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Scala #Documentation #Migration #SQL (Structured Query Language) #Datasets #Data Architecture #Airflow #Cloud #Collibra #ETL (Extract, Transform, Load) #Storage #Batch #GCP (Google Cloud Platform) #dbt (data build tool) #BigQuery #Data Quality #Data Governance #Metadata #Dataflow #Monitoring #Microservices #Deployment #Data Engineering
Role description
Senior Data Engineer
Duration: Initial 3-month contract (with potential extension - project dependent)
Location: Remote / London (Hybrid)
Start date: 1st of April
Onsite requirements: Hybrid - primarily remote with on-site attendance in London for key meetings
Engagement Type: Outside IR35
Day Rate: Competitive
Right to work: Candidates must have the legal right to work in the UK; sponsorship cannot be provided. BPSS clearance is required, or candidates must be willing to undergo standard financial services background screening.
Atrium Global is supporting a client in the banking sector delivering a large-scale data platform modernisation programme.
Job Overview
We are seeking an experienced Senior Data Engineer to support the design and delivery of a strategic data capability as part of a major UK banking data transformation programme.
The role will focus on building scalable and governed data engineering pipelines within Google Cloud Platform. You will contribute to engineering patterns that generate, standardise, and manage business and technical keys across multiple data products, supporting a wider migration from incumbent warehouse platforms to a modern cloud-based architecture.
The successful candidate will work closely with solution architects, data engineers, and platform teams to build reliable pipelines, implement deterministic identifier generation, and support matching, merge, and data standardisation logic across enterprise data products.
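For illustration, "deterministic identifier generation" here typically means deriving the same key for the same business entity on every run, rather than assigning random values. The sketch below shows one common way to do this in BigQuery SQL; the table and column names (raw_accounts, sort_code, account_number) are hypothetical and are not taken from the client's data model.

```sql
-- Minimal sketch of deterministic surrogate key generation (illustrative only).
-- Business key parts are standardised, then hashed, so the same source record
-- resolves to the same key on every run and across data products.
SELECT
  TO_HEX(SHA256(CONCAT(
    'account',                         -- entity qualifier to avoid collisions across entity types
    '|', UPPER(TRIM(sort_code)),       -- standardised business key component (hypothetical column)
    '|', UPPER(TRIM(account_number))   -- standardised business key component (hypothetical column)
  ))) AS account_sk,
  sort_code,
  account_number
FROM raw_accounts;                     -- hypothetical source table
```

Deterministic UUIDs follow the same principle: the identifier is derived from the standardised business key rather than generated at random, so re-runs and parallel pipelines agree on the same key.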
RESPONSIBILITIES
• Design and develop scalable data engineering components supporting enterprise key management patterns across GCP-based data products
• Build pipelines and transformation logic to derive candidate keys and apply matching, merge, cleansing, and standardisation rules
• Implement processes for generating and maintaining surrogate keys, deterministic UUIDs, and master keys across data platforms
• Integrate key generation processes into BigQuery-based data product stores
• Develop and optimise transformation logic using technologies such as dbt, Dataflow, Dataproc and BigQuery SQL
• Support implementation of data protection controls including masking, obfuscation, tokenisation or pseudonymisation of sensitive identifiers (a minimal illustrative sketch follows this list)
• Collaborate with solution architects, data architects, microservices engineers and platform teams to deliver aligned engineering solutions
• Support testing and validation of pipelines including key generation quality, interoperability, lineage and operational resilience
• Produce clear technical documentation including pipeline designs, implementation standards, and operational runbooks
• Work within governance, change management and release processes required in regulated financial services environments
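As a rough illustration of the data protection responsibility above, the sketch below shows one common pseudonymisation pattern in BigQuery SQL: replacing a sensitive identifier with a salted hash before it reaches the curated data product. The table and column names (customer_raw, national_id) and the inline salt are hypothetical; in practice the salt or key would be held in a secrets or key management service, not in the SQL.

```sql
-- Minimal pseudonymisation sketch (illustrative only; names and salt are hypothetical).
-- The sensitive identifier is replaced with a salted hash and the raw value is dropped,
-- so downstream consumers can join on a stable pseudonym without seeing the real value.
DECLARE salt STRING DEFAULT 'replace-with-secret-from-key-management';

SELECT
  TO_HEX(SHA256(CONCAT(salt, '|', national_id))) AS national_id_pseudo,
  * EXCEPT (national_id)
FROM customer_raw;
```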
REQUIREMENTS
• Must have 5+ years' experience working as a Data Engineer in enterprise data environments
• Must have strong hands-on experience with Google Cloud Platform, including BigQuery, Dataflow, Dataproc and Cloud Storage
• Experience working within the banking or regulated financial services sector
• Strong SQL engineering skills with experience designing transformations for large-scale structured datasets
• Experience using dbt or similar transformation frameworks for modular data modelling and deployment (see the dbt sketch after this list)
• Experience building pipelines involving record standardisation, matching, merge logic and identity resolution patterns
• Strong understanding of surrogate keys, business keys, deterministic identifiers and modern data platform key management
• Experience handling sensitive data with enterprise controls such as masking, hashing, tokenisation or pseudonymisation
• Experience designing pipelines with strong data quality, lineage and traceability controls
• Understanding of batch and event-driven processing patterns and operational concerns including monitoring and recovery
• Experience working within regulated environments or financial services programmes
• Strong communication skills and ability to collaborate across engineering and architecture teams
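To make the dbt requirement above more concrete, a minimal dbt model sketch is shown below. The model and source names (dim_account, stg_accounts) are hypothetical, and the dbt-utils package is assumed to be installed; its generate_surrogate_key macro hashes the listed business key columns into a deterministic surrogate key.

```sql
-- models/dim_account.sql: minimal dbt model sketch (hypothetical names, illustrative only).
-- The surrogate key logic lives in a version-controlled, testable model rather than ad hoc SQL.
select
    {{ dbt_utils.generate_surrogate_key(['sort_code', 'account_number']) }} as account_sk,
    sort_code,
    account_number
from {{ ref('stg_accounts') }}
```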
NICE TO HAVE
• Experience working on data platform modernisation or data product-oriented architectures
• Understanding of Data Mesh or product-aligned data ownership models
• Experience integrating data platforms with microservices architectures
• Exposure to orchestration tools such as Airflow, Control-M or similar scheduling frameworks
• Experience with data governance and metadata tooling such as Dataplex, Collibra or equivalent
• Experience supporting migration from legacy warehouse platforms to cloud-native data platforms





