

GCP Data Engineer - UK
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with a contract duration in London, UK (Remote), offering a competitive pay rate. Requires 5+ years in data engineering, 2+ years with Neo4j, proficiency in SQL, Python, and relevant certifications.
- Country: United Kingdom
- Currency: £ GBP
- Day rate: -
- Date discovered: June 13, 2025
- Project duration: Unknown
- Location type: Remote
- Contract type: Unknown
- Security clearance: Unknown
- Location detailed: United Kingdom
- Skills detailed: #Computer Science #Data Management #Data Engineering #Knowledge Graph #Metadata #Spark (Apache Spark) #BigQuery #Apache Beam #Neo4J #AWS (Amazon Web Services) #Scala #Data Governance #Big Data #Security #Data Catalog #Python #Dataflow #Data Pipeline #Azure #IAM (Identity and Access Management) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Datasets #Storage #GCP (Google Cloud Platform) #Kafka (Apache Kafka) #Cloud
Role description
Hi Professionals,
Greetings from Ampstek!
Hope you are in high spirits. Please stay safe.
Our client is looking for a GCP Data Engineer to join a high-growth organization. If you are interested, please share your resume with sudhakaran.m@ampstek.com.
Role: GCP Data Engineer
Location: London, UK (Remote)
Duration: Contract
Job Description:
Key Responsibilities
• Architect and build data pipelines on GCP integrating structured, semi-structured, and unstructured data sources.
• Design, implement, and optimize graph database solutions using Neo4j, Cypher queries, and GCP integrations.
• Develop ETL/ELT workflows using Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
• Design graph models for real-world applications such as fraud detection, network analysis, and knowledge graphs.
• Optimize the performance of graph queries and design for scalability.
• Support ingestion of large-scale datasets into GCP environments using Apache Beam, Spark, or Kafka.
• Implement metadata management, security, and data governance using Data Catalog and IAM.
• Work across functional teams and clients in diverse EMEA time zones and project settings.
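To give a flavor of the graph-modeling responsibility above (fraud detection via data relationship mapping), here is a minimal, hedged Python sketch of the pattern "flag accounts linked through a shared device." In the role itself this would be modeled in Neo4j and queried in Cypher; the class and function names here (`Graph`, `shared_device_accounts`) are illustrative only, not part of the job description or any client codebase.

```python
from collections import defaultdict

class Graph:
    """Tiny in-memory stand-in for a property graph (Neo4j would hold this in practice)."""

    def __init__(self):
        # (relationship type, node) -> set of connected nodes
        self.edges = defaultdict(set)

    def add_edge(self, rel_type, src, dst):
        # Treat edges as undirected for this sketch.
        self.edges[(rel_type, src)].add(dst)
        self.edges[(rel_type, dst)].add(src)

    def neighbors(self, rel_type, node):
        return self.edges[(rel_type, node)]

def shared_device_accounts(graph, account):
    """Fraud-detection pattern: other accounts reachable via a shared device."""
    linked = set()
    for device in graph.neighbors("USES_DEVICE", account):
        linked |= graph.neighbors("USES_DEVICE", device)
    linked.discard(account)  # exclude the account we started from
    return linked

g = Graph()
g.add_edge("USES_DEVICE", "acct_1", "dev_A")
g.add_edge("USES_DEVICE", "acct_2", "dev_A")  # acct_2 shares dev_A with acct_1
g.add_edge("USES_DEVICE", "acct_3", "dev_B")  # acct_3 is isolated
print(sorted(shared_device_accounts(g, "acct_1")))  # ['acct_2']
```

The equivalent Cypher would be a two-hop `MATCH` over `USES_DEVICE` relationships; the point of the sketch is only the graph-schema idea of linking entities through shared intermediate nodes.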
Minimum Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 5+ years of experience in data engineering, including 2+ years with Neo4j or another graph database platform.
• Proficiency in SQL, Python, and the Cypher query language.
• Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Knowledge of graph theory, graph schema modeling, and data relationship mapping.
• Fluent in English.
• Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
Thanks & Regards,
Sudhakaran
IT Recruiter | Europe & UK
Email - sudhakaran.m@ampstek.com
Tel - +44(20)45150009
Ampstek Services Limited