

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer specializing in Neo4j, with a 6-month remote contract inside IR35. Requires 5+ years of data engineering experience, proficiency in SQL and Python, and a Google Professional Data Engineer certification.
Country
United Kingdom
Currency
£ GBP
-
Day rate
-
Date discovered
July 23, 2025
Project duration
More than 6 months
-
Location type
Remote
-
Contract type
Inside IR35
-
Security clearance
Unknown
-
Location detailed
London Area, United Kingdom
-
Skills detailed
#Data Management #Libraries #Neo4J #Data Architecture #Big Data #Kafka (Apache Kafka) #Security #Scala #Dataflow #Apache Beam #Storage #IAM (Identity and Access Management) #BigQuery #Data Engineering #Data Ingestion #Data Science #Computer Science #Datasets #Cloud #Data Security #Python #AWS (Amazon Web Services) #Data Pipeline #Spark (Apache Spark) #Azure #Data Catalog #ETL (Extract, Transform, Load) #Data Governance #Databases #SQL (Structured Query Language) #GDPR (General Data Protection Regulation) #Knowledge Graph #Metadata #Compliance #Graph Databases #GCP (Google Cloud Platform)
Role description
Job Title: Data Engineer - Neo4j
Location: London, UK - Remote
Duration: 6 Months (Extendable)
Employment Type: Inside IR35
Roles & Responsibilities:
About the Role
We are seeking a highly skilled Data Engineer with hands-on experience in Graph Databases (Neo4j) and modern data ingestion and optimization techniques. You will help EMEA clients design intelligent data platforms leveraging GCP services to support complex, connected data use cases and drive performance at scale.
Key Responsibilities
• Architect and build data pipelines on GCP integrating structured, semi-structured, and unstructured data sources.
• Design, implement, and optimize Graph Database solutions using Neo4j, Cypher queries, and GCP integrations.
• Develop ETL/ELT workflows using Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
• Design graph models for real-world applications such as fraud detection, network analysis, and knowledge graphs.
• Optimize performance of graph queries and design for scalability.
• Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
• Implement metadata management, security, and data governance using Data Catalog and IAM.
• Work across functional teams and clients in diverse EMEA time zones and project settings.
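As an illustrative aside (not part of the client's brief): the fraud-detection use case above typically means modelling accounts and shared attributes (devices, cards) as graph nodes and traversing their relationships. The toy Python sketch below stands in for that graph relationship mapping using plain dictionaries; node names and edges are invented for the example, and a real deployment would express the same question as a short Cypher pattern query in Neo4j.

```python
from collections import deque

# Toy graph for a fraud-ring query: accounts linked through shared
# devices and cards. In Neo4j these would be nodes and relationships;
# a plain adjacency list stands in here purely for illustration.
edges = {
    "acct:A": ["device:D1"],
    "acct:B": ["device:D1", "card:C9"],
    "acct:C": ["card:C9"],
    "acct:D": ["device:D2"],
    "device:D1": ["acct:A", "acct:B"],
    "card:C9": ["acct:B", "acct:C"],
    "device:D2": ["acct:D"],
}

def connected_accounts(start, edges):
    """Breadth-first traversal: every account reachable from `start`
    through shared attributes (a minimal 'fraud ring' lookup)."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in edges.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return sorted(n for n in seen if n.startswith("acct:"))

print(connected_accounts("acct:A", edges))
# → ['acct:A', 'acct:B', 'acct:C']
```

In Neo4j the equivalent would be a variable-length relationship pattern (roughly `MATCH (a:Account {id: 'A'})-[*..4]-(b:Account) RETURN b`), which is the kind of Cypher modelling and query optimisation this role calls for.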
Minimum Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
• Proficiency in SQL, Python, and the Cypher query language.
• Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Knowledge of graph theory, graph schema modeling, and data relationship mapping.
• Fluent in English.
• Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
Preferred Qualifications
• 8 to 10+ years of overall experience.
• Experience building solutions on open-source graph databases (e.g., Neo4j), including data ingestion and optimization techniques.
• Familiarity with the Graph Data Science libraries in Neo4j.
• Understanding of data architecture principles, data mesh, and distributed processing.
• Prior experience in customer-facing roles or professional services.
• Background in data security, compliance (e.g., GDPR), and regional data residency requirements.