Norton Blake

GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer, contracted for 6 months at £625 per day, remote. Key skills include strong ETL/ELT experience, GCP data services, SQL, and modern data architecture expertise. Data Vault 2.0 and BigQuery experience are advantageous.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
625
🗓️ - Date
April 24, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Data Pipeline #Observability #GCP (Google Cloud Platform) #Data Vault #Data Ingestion #Data Engineering #Deployment #Scala #SQL (Structured Query Language) #Data Architecture #Automation #Monitoring #Cloud #ETL (Extract, Transform, Load) #Data Framework #BigQuery
Role description
Senior Data Engineer | GCP | Build Modern Data Platforms
Contract - 6 Months
Inside IR35
£625 per day
Remote

We’re hiring a Senior Data Engineer to design, build, and optimise scalable data pipelines on Google Cloud Platform. This is a hands-on role for someone with a self-starter mindset who enjoys working with modern data architectures and open-source frameworks.

What you’ll be doing:
• Designing and building ETL/ELT pipelines
• Developing scalable data workflows on GCP
• Implementing robust data ingestion frameworks
• Working with structured and semi-structured data
• Collaborating with Data Modelling & Analytics teams
• Driving data reliability, monitoring, and observability
• Automating deployments and workflows
• Contributing to tooling and framework decisions

What we’re looking for:
• Strong ETL/ELT pipeline experience
• Proven GCP data services expertise
• Strong SQL and data transformation skills
• Experience with orchestration and pipeline automation
• Background in modern data architectures (lakehouse/warehouse)
• Proactive, ownership-driven mindset

Nice to have:
• Data Vault 2.0 exposure
• BigQuery optimisation experience
• Open-source data framework experience
• CI/CD for data pipelines

If you’re interested in building scalable, modern data platforms, feel free to reach out or apply.