Norton Blake

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract in London, requiring weekly/bi-weekly travel to Germany. Key skills include GCP, ETL/ELT pipeline design, advanced SQL, and experience with modern data architectures.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
680
-
🗓️ - Date
February 26, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Engineering #Data Ingestion #Storage #Cloud #Data Pipeline #Data Processing #GCP (Google Cloud Platform) #Observability #SQL (Structured Query Language) #Deployment #Data Architecture #Data Vault #Dataflow #Monitoring #Vault #Scala #ETL (Extract, Transform, Load) #BI (Business Intelligence) #BigQuery
Role description
Senior Data Engineer
Location: London - Weekly / Bi-Weekly Travel to Germany
Sector: Events and Live Entertainment
Work Type: Contract - 6 Months, Inside IR35

The Client
My client is a leading organisation within the events and live entertainment space, delivering large-scale live experiences to millions of customers each year. With a growing digital and data function, they are investing in a modern, cloud-first data platform to power commercial insight, audience analytics, operational reporting and real-time decision-making across venues and events. They are now seeking a Senior Data Engineer to play a key role in building and optimising their Google Cloud-based data platform.

The Role
As a Senior Data Engineer, you will design, build and optimise scalable data pipelines on Google Cloud Platform. You will take ownership of core data engineering patterns, evaluate and implement modern open-source frameworks, and help shape the evolution of the organisation’s cloud data architecture. This role requires a proactive, self-starter mindset and the ability to operate independently while collaborating closely with the Data Modelling, Analytics and wider technology teams.
Key Responsibilities
• Design, build and maintain scalable ETL and ELT pipelines
• Develop and optimise data processing workflows on Google Cloud Platform
• Implement robust and scalable data ingestion frameworks
• Work with structured and semi-structured data sources across multiple business domains
• Collaborate closely with Data Modelling and Analytics teams to enable reliable downstream reporting and insight
• Ensure data reliability through monitoring, observability and quality controls
• Automate deployment processes and data workflows wherever possible
• Contribute to tooling decisions, framework selection and data platform standards

Required Experience
• Strong experience designing and building ETL and ELT pipelines in production environments
• Hands-on experience with GCP data services such as BigQuery, Cloud Storage, Dataflow or equivalent
• Advanced SQL and strong data transformation capability
• Experience with orchestration tools and automated pipeline scheduling
• Experience working within modern data architectures, including warehouse and lakehouse patterns
• Demonstrated ownership mindset with strong problem-solving capability

Desirable Experience
• Exposure to Data Vault 2.0 modelling concepts
• Experience optimising performance and cost within BigQuery
• Experience evaluating and implementing open-source data engineering frameworks
• Experience implementing CI/CD for data pipelines