Norton Blake

GCP Data Engineer - Dataflow

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer - Dataflow on a 6-month contract, inside IR35, paying up to £650 per day, fully remote. Key requirements are ETL/ELT pipeline experience, hands-on expertise with GCP and Google Cloud Dataflow, and strong SQL skills.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
650
🗓️ - Date
May 6, 2026
🕒 - Duration
6 months
🏝️ - Location
Remote
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Monitoring #Dataflow #Batch #Scala #Data Framework #Data Vault #GCP (Google Cloud Platform) #Data Engineering #ETL (Extract, Transform, Load) #BigQuery #Data Pipeline #Vault #SQL (Structured Query Language) #Deployment #Data Architecture #Data Ingestion #Observability #Automation #Cloud
Role description
GCP Data Engineer - Dataflow
6-month contract | Inside IR35 | Up to £650 per day | Remote

We're hiring a Senior Data Engineer to design, build, and optimise scalable data pipelines on Google Cloud Platform. This is a hands-on role for a self-starter who enjoys working with modern data architectures and open-source frameworks.

What you'll be doing:
• Designing and building ETL/ELT pipelines
• Developing scalable data workflows on GCP, with a strong focus on Google Cloud Dataflow (see the pipeline sketch below)
• Implementing robust data ingestion frameworks using batch and streaming approaches
• Working with structured and semi-structured data
• Collaborating with Data Modelling & Analytics teams
• Driving data reliability, monitoring, and observability
• Automating deployments and workflows
• Contributing to tooling and framework decisions

What we're looking for:
• Strong ETL/ELT pipeline experience
• Proven GCP data services expertise, including hands-on experience with Google Cloud Dataflow (essential)
• Strong SQL and data transformation skills
• Experience with orchestration and pipeline automation
• A background in modern data architectures (lakehouse/warehouse)
• A proactive, ownership-driven mindset

Nice to have:
• Data Vault 2.0 exposure
• BigQuery optimisation experience
• Open-source data framework experience
• CI/CD for data pipelines
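To illustrate the kind of work described above, here is a minimal sketch of a Dataflow pipeline written with the Apache Beam Python SDK (the programming model that Dataflow executes as a managed service): it reads CSV files from Cloud Storage, parses each row, and appends the results to a BigQuery table. Every resource name below (project, region, bucket, dataset, table) and the three-column schema are hypothetical placeholders for illustration, not details taken from this role.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv_line(line):
    # Hypothetical row layout: id,name,amount
    fields = line.split(",")
    return {"id": int(fields[0]), "name": fields[1], "amount": float(fields[2])}

# All resource names are placeholders.
options = PipelineOptions(
    runner="DataflowRunner",                   # swap for "DirectRunner" to test locally
    project="example-project",                 # hypothetical GCP project
    region="europe-west2",
    temp_location="gs://example-bucket/tmp",   # hypothetical staging bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCsv" >> beam.io.ReadFromText(
            "gs://example-bucket/input/*.csv", skip_header_lines=1)
        | "ParseRows" >> beam.Map(parse_csv_line)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:example_dataset.example_table",
            schema="id:INTEGER,name:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
    )

The same pipeline shape extends to streaming: swapping the text source for a Pub/Sub read and running with streaming enabled is the usual way the batch and streaming ingestion mentioned above are both handled on Dataflow.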