SR2

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with strong GCP experience, focusing on building scalable data pipelines. It is a part-time, fully remote position in the UK, with an initial 6-month contract and a pay rate of £450-£500 per day.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
450-500
🗓️ - Date
March 14, 2026
🕒 - Duration
6 months (initial)
🏝️ - Location
Remote
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #SaaS (Software as a Service) #SQL (Structured Query Language) #Data Engineering #Data Pipeline #Python #Data Processing #Cloud #Replication #Spark (Apache Spark) #Scala #Airflow #Data Ingestion
Role description
GCP - Senior Data Engineer

Location: Fully remote – UK
Contract length: Initial 6 months
Working frequency: Part-time
Rate: £450-£500 per day
IR35: Outside IR35
Tech: GCP / Datastream / SQL / Python

SR2 is supporting a client delivering a large-scale enterprise data platform on Google Cloud Platform (GCP) and is looking for a Senior Data Engineer to join the team on an initial 6-month contract.

This is a hands-on engineering role focused on building and maintaining cloud-native data pipelines and enabling the ingestion of data from multiple sources, including on-premise systems and SaaS platforms. The environment involves complex enterprise integrations, with a particular focus on bringing distributed data sources into a centralised cloud data platform.

Role:
• Design, build and maintain scalable data pipelines on GCP
• Ingest and integrate data from on-premise systems into cloud environments
• Support integration of SaaS data sources into the organisation’s internal data platform
• Work with enterprise data ingestion technologies such as Datastream or equivalent replication tools
• Build and optimise data ingestion and transformation workflows
• Contribute to cloud architecture discussions and data platform design
• Ensure pipelines are reliable, secure and performant in production environments

Requirements:
• Strong Google Cloud Platform (GCP) experience
• Proven ability to design and build scalable data pipelines
• Experience integrating data from on-premise environments into cloud platforms
• Experience with Google Datastream or similar replication technologies
• Experience ingesting data from external SaaS systems
• Strong SQL and Python skills
• Understanding of cloud networking considerations for data ingestion
• Experience working in enterprise-scale data environments
• Experience with Airflow or other orchestration frameworks
• Exposure to distributed data processing frameworks such as Spark
• Google Associate Cloud Engineer or Professional Cloud Engineer certification (desirable)

Please apply with a copy of your CV and Emma from SR2 will contact potential candidates regarding next steps.
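For context, the orchestration side of a role like this often looks something like the sketch below: a minimal Airflow DAG that lands a daily SaaS extract in Google Cloud Storage and then merges rows replicated into BigQuery (for example by Datastream) into a curated table. This is an illustrative sketch only, assuming Airflow 2.4+ with the Google provider and google-cloud-storage installed; every endpoint, bucket, dataset and table name in it is a hypothetical placeholder rather than a detail of this engagement.

```python
# Illustrative sketch only: the kind of ingest-and-transform workflow the
# role describes. All names below (endpoint, bucket, dataset, tables) are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)


def land_saas_extract(ds: str, **_) -> None:
    """Pull one day of data from a hypothetical SaaS API and land it in GCS."""
    import requests
    from google.cloud import storage

    resp = requests.get(
        "https://api.example-saas.invalid/v1/orders",  # placeholder endpoint
        params={"date": ds},
        timeout=60,
    )
    resp.raise_for_status()
    storage.Client().bucket("example-raw-landing").blob(
        f"saas/orders/{ds}.json"
    ).upload_from_string(resp.text, content_type="application/json")


with DAG(
    dag_id="saas_ingest_to_platform",  # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2026, 1, 1),
    catchup=False,
) as dag:
    # Land the day's SaaS extract in a raw landing bucket.
    land = PythonOperator(
        task_id="land_saas_extract",
        python_callable=land_saas_extract,
    )

    # Merge rows replicated into BigQuery (e.g. by Datastream) into a
    # curated table. The MERGE statement is a placeholder.
    merge = BigQueryInsertJobOperator(
        task_id="merge_orders",
        configuration={
            "query": {
                "query": """
                    MERGE `analytics.orders` AS t
                    USING `raw.orders_stream` AS s
                    ON t.order_id = s.order_id
                    WHEN MATCHED THEN UPDATE SET t.status = s.status
                    WHEN NOT MATCHED THEN INSERT ROW
                """,
                "useLegacySql": False,
            }
        },
    )

    land >> merge
```

In practice, the on-premise change-data-capture side would typically be handled by Datastream itself (configured as a managed service rather than coded), with Airflow orchestrating the downstream transformation and quality steps.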