ProSapiens HR

GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a 6-month (extendable) contract, hybrid in London & Chester, UK, paying competitively. Key requirements include GCP, BigQuery, and Dataflow, plus banking/financial services experience. GCP certification is a plus.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 26, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Greater London, England, United Kingdom
-
🧠 - Skills detailed
#Kubernetes #GCP (Google Cloud Platform) #Security #BigQuery #SQL (Structured Query Language) #Data Engineering #Scala #Data Pipeline #Java #Cloud #Terraform #Data Modeling #Datasets #Dataflow #Apache Beam #Jenkins #Storage #Big Data #Kafka (Apache Kafka) #Batch #Infrastructure as Code (IaC) #Python #dbt (data build tool) #Spark (Apache Spark)
Role description
🚀 We're Hiring: GCP Data Engineer (Contract Role)
📍 Location: London & Chester, UK (Hybrid – 2-3 days onsite)
💼 Client: Leading Bank / Financial Services (details shared during interview)
📅 Duration: 6 Months (extendable) | Outside IR35
⏳ Notice Period: Immediate to 1 month

🔎 About the Role
We are looking for a Senior GCP Data Engineer to design and build scalable, high-performance data platforms on Google Cloud. This role involves working with large-scale datasets and developing both batch and real-time data pipelines for enterprise applications.

💡 Key Responsibilities
✔️ Design and build batch and real-time data pipelines on GCP
✔️ Work with BigQuery, Dataflow, Cloud Storage, and Spanner
✔️ Develop real-time streaming solutions using Kafka / Apache Beam
✔️ Implement Infrastructure as Code using Terraform
✔️ Deploy workloads on Kubernetes (GKE)
✔️ Build and manage CI/CD pipelines (Jenkins / Spinnaker)
✔️ Collaborate on data modeling and architecture
✔️ Ensure scalability, security, and cost optimization

✅ Required Skills
🔹 Strong hands-on experience with GCP
🔹 Expertise in BigQuery and Dataflow / Apache Beam
🔹 Experience in batch and streaming data engineering
🔹 Knowledge of Kafka / Spark / big data tools
🔹 Proficiency in Python / Java / SQL
🔹 Experience with Terraform, Kubernetes, and CI/CD
🔹 Good understanding of data modeling

⭐ Nice to Have
➕ dbt experience
➕ Banking / financial domain experience
➕ GCP Certification (Associate / Data Engineer)

📩 Interested candidates can share their updated resume:
✉️ Email: prajakta@prosapiens.in
🔗 Referrals would be highly appreciated!