
Lead GCP Data Engineer - Near Real-Time Data - Square One Resources

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead GCP Data Engineer focused on near real-time data ingestion, offering a 6-month contract at £500 per day in Birmingham. Key skills include GCP services, CDC technologies, Apache Beam, and team leadership experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
500
-
🗓️ - Date
January 30, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Birmingham, England, United Kingdom
-
🧠 - Skills detailed
#VPC (Virtual Private Cloud) #Kafka (Apache Kafka) #Data Ingestion #BigQuery #API (Application Programming Interface) #Dataflow #Automation #Clustering #SQL (Structured Query Language) #Python #Terraform #Data Engineering #Apache Beam #GCP (Google Cloud Platform) #Security #Logging #Strategy #Storage #Observability #Cloud #Java #IAM (Identity and Access Management) #Monitoring
Role description
Job Title: Lead Data Engineer - Near Real-Time Ingestion (GCP)
Location: Birmingham, two days per week on site
Salary/Rate: £500 per day
Start Date: February
Job Type: 6-month contract - Inside IR35
Company Introduction
We have an exciting opportunity now available with one of our sector-leading consultancy clients! They are currently looking for a Lead Data Engineer to join their team for a six-month contract.
Job Responsibilities/Objectives
You will be responsible for owning and driving the design, development, and operational excellence of our near real-time (NRT) data ingestion platforms on Google Cloud. In this role, you will architect and deliver low-latency ingestion pipelines, enabling mission-critical, high-throughput data flows from diverse source systems into BigQuery and Cloud SQL using cutting-edge CDC and streaming technologies.
• Design and build near real-time ingestion frameworks using Pub/Sub, Kafka, Dataflow (Beam), GCS, BigQuery, and Cloud SQL.
• Develop and optimize streaming pipelines leveraging Apache Beam in Java or Python, deployed on Dataflow.
• Implement CDC ingestion patterns using Datastream, ensuring resilience against schema drift and low-latency updates.
• Utilize advanced BigQuery optimization techniques, including the Storage Write API, partitioning, clustering, and materialized views.
• Tune streaming workloads for latency, throughput, back-pressure management, windowing, and watermarks.
• Set up robust dead-letter queues, replay strategies, retries, and error-handling frameworks.
• Establish enterprise-grade observability with Cloud Monitoring, Logging, and Trace.
• Implement strong security controls across IAM, KMS, VPC Service Controls, Secret Manager, and DLP.
• Lead the NRT ingestion roadmap and partner closely with architecture, platform, and analytics teams.
• Mentor data engineers and promote engineering excellence, reusable patterns, and automation.
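As a rough sketch of the dead-letter routing the responsibilities describe: in a Beam/Dataflow job this logic would typically live in a DoFn with tagged outputs, but it is shown here in plain Python so the pattern is visible without a runner. The JSON payload format and function name are illustrative assumptions, not details from the posting.

```python
import json

# Illustrative sketch only, not the client's pipeline: the core routing
# step behind a dead-letter queue. Well-formed payloads are parsed and
# passed on; malformed ones are kept verbatim so they can be replayed
# after a fix, rather than being silently dropped.
def route_events(raw_payloads):
    """Split raw payloads into parsed records and dead-letter entries."""
    parsed, dead_letter = [], []
    for raw in raw_payloads:
        try:
            parsed.append(json.loads(raw))
        except (ValueError, TypeError):
            dead_letter.append(raw)  # preserved unchanged for replay
    return parsed, dead_letter

good, bad = route_events([b'{"id": 1}', b"not-json"])
```

In a production Dataflow pipeline the `dead_letter` branch would usually be written to a separate Pub/Sub topic or GCS path, with monitoring on its volume.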
Required Skills/Experience
The ideal candidate will have the following:
• Deep hands-on expertise with Google Cloud Platform services, including:
• Pub/Sub
• Dataflow
• BigQuery
• GCS
• Cloud Composer
• Strong experience with CDC technologies, especially Datastream.
• Proficiency in Apache Beam (Java or Python) and SQL.
• Solid knowledge of Terraform, CI/CD/GitOps, and container orchestration (e.g., GKE).
• Proven experience leading teams, driving technical strategy, and delivering complex ingestion systems at scale.
Disclaimer
Notwithstanding any guidelines given on the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies. Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.