

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 8+ years of experience, offering a 6-month contract, remote/hybrid in the UK. Requires Google Professional Data Engineer certification, proficiency in SQL, Python, and Neo4j, and expertise in GCP services.
Country
United Kingdom
Currency
£ GBP
Day rate
-
Date discovered
June 13, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Fixed Term
Security clearance
Unknown
Location detailed
London Area, United Kingdom
Skills detailed
#Computer Science #Data Management #Data Engineering #Knowledge Graph #Metadata #Spark (Apache Spark) #BigQuery #HBase #Apache Beam #Neo4J #AWS (Amazon Web Services) #Database Performance #Data Architecture #Scala #GDPR (General Data Protection Regulation) #Data Governance #Big Data #Security #Data Catalog #Python #Dataflow #Libraries #Data Pipeline #Azure #IAM (Identity and Access Management) #SQL (Structured Query Language) #Data Science #ETL (Extract, Transform, Load) #Datasets #Storage #Data Security #GCP (Google Cloud Platform) #Kafka (Apache Kafka) #Cloud
Role description
We're Hiring: Data Engineer
Location: UK (Flexible time zone, remote/hybrid available)
Experience: 8+ years
Contract: 6 months
Language Requirement: Fluent in English
Certification Required: Google Professional Data Engineer or equivalent
Are you an experienced Data Engineer passionate about modern data platforms and graph-based architectures? Join Derisk360 to help enterprise clients across EMEA design intelligent, scalable solutions on GCP that unlock value from complex, connected datasets.
What You'll Do:
• Architect and develop end-to-end data pipelines on Google Cloud Platform (GCP), integrating structured, semi-structured, and unstructured data sources.
• Design and implement advanced Graph Database solutions using Neo4j, Cypher queries, and GCP-native integrations.
• Create ETL/ELT workflows leveraging GCP services including Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
• Model real-world use cases in Neo4j such as fraud detection, knowledge graphs, and network analysis.
• Optimize graph database performance, ensure query scalability, and maintain system efficiency.
• Manage ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
• Implement metadata management, security, and data governance using Data Catalog and IAM.
• Collaborate with cross-functional teams and clients across diverse EMEA time zones and domains.
What You Bring:
• 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
• Proficiency in SQL, Python, and the Cypher query language.
• Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Expertise in graph theory, graph schema modeling, and data relationship mapping.
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Fluent in English.
• Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
Nice to Have:
• 8 to 10+ years of overall experience.
• Experience with open-source Graph DB tools and ingestion/optimization techniques.
• Familiarity with Graph Data Science libraries in Neo4j.
• Understanding of data architecture principles, data mesh, and distributed processing.
• Prior experience in customer-facing roles or professional services.
• Awareness of GDPR, data security, and regional data residency standards.
What You'll Get:
• Lead the design of mission-critical data platforms for clients across EMEA.
• Work on cutting-edge graph-based use cases including fraud detection and knowledge graphs.
• Gain hands-on experience with modern cloud-native and open-source technologies.
• Join a culture of innovation, engineering excellence, and continuous learning.