

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in the EMEA region, offering a temporary contract at £483.11-£500.00 per day. It requires 5+ years of data engineering experience, proficiency in SQL and GCP services, and a Google Professional Data Engineer certification.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
500
🗓️ - Date discovered
July 5, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
London
🧠 - Skills detailed
#Data Security #Python #SQL (Structured Query Language) #GIT #AWS (Amazon Web Services) #Version Control #Security #Apache Beam #Data Science #Apache Spark #Programming #Storage #Libraries #Dataflow #Schema Design #Scripting #Data Engineering #Data Architecture #Spark (Apache Spark) #Cloud #Data Ingestion #Data Manipulation #Airflow #Apache Airflow #BigQuery #Compliance #Neo4J #Datasets #Scala #Big Data #Clustering #Data Lake #Data Processing #Java #GDPR (General Data Protection Regulation) #Hadoop #Azure #GCP (Google Cloud Platform) #Computer Science
Role description
Data Engineer | EMEA Region
Location: EMEA (Flexible time zone, remote/hybrid available)
Language Requirement: Fluent in English
Certification Required: Google Professional Data Engineer or equivalent
Required Qualifications
· Bachelor's degree in Computer Science, Engineering, or a related field.
· 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
· 4+ years of hands-on experience as a Data Engineer, with at least 2+ years specifically working with Google Cloud Platform (GCP) data services.
· Strong proficiency in SQL and experience with schema design and query optimization for large datasets.
· Expertise in BigQuery, including advanced SQL, partitioning, clustering, and performance tuning (see the BigQuery sketch after this list).
· Hands-on experience with at least one of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow); a minimal pipeline sketch also follows this list.
· Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
· Understanding of data warehousing and data lake concepts and best practices.
· Experience with version control systems (e.g., Git).
· Proficiency in SQL, Python, and Cypher query language.
· Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
· Knowledge of graph theory, graph schema modeling, and data relationship mapping.
· Fluent in English.
· Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
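For illustration only, here is a minimal sketch of the BigQuery partitioning and clustering skills named above, assuming the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical, not this role's actual schema.
```python
# A minimal sketch, assuming the google-cloud-bigquery client library and a
# hypothetical project, dataset, and table; names are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()  # credentials resolved from the environment

# Create a date-partitioned, clustered table (hypothetical names).
ddl = """
CREATE TABLE IF NOT EXISTS `my_project.analytics.events` (
  event_id   STRING,
  user_id    STRING,
  event_type STRING,
  event_ts   TIMESTAMP
)
PARTITION BY DATE(event_ts)      -- queries scan only the partitions they filter on
CLUSTER BY user_id, event_type   -- co-locates rows for common filter columns
"""
client.query(ddl).result()

# A query filtered on the partition column prunes all other partitions.
sql = """
SELECT event_type, COUNT(*) AS events
FROM `my_project.analytics.events`
WHERE DATE(event_ts) = '2025-07-01'
GROUP BY event_type
"""
for row in client.query(sql).result():
    print(row.event_type, row.events)
```
Partitioning on the timestamp column limits scans to the dates a query touches, and clustering on commonly filtered columns reduces the bytes read within each partition.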
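Similarly, a minimal streaming pipeline sketch for the Dataflow (Apache Beam) experience named above, assuming the apache-beam[gcp] package; the Pub/Sub topic and BigQuery table are hypothetical, and the runner (DirectRunner locally, DataflowRunner on GCP) is left to pipeline options.
```python
# A minimal streaming sketch, assuming the apache-beam[gcp] package and
# hypothetical Pub/Sub topic and BigQuery table names.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my_project/topics/events"  # hypothetical topic
        )
        | "ParseJson" >> beam.Map(json.loads)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my_project:analytics.events",  # hypothetical table, assumed to exist
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```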
Preferred Qualifications
· 8 to 10+ years of overall experience.
· Experience building open-source graph databases (Neo4j), including data ingestion and optimization techniques.
· Familiarity with Graph Data Science libraries in Neo4j (see the Cypher sketch after this list).
· Understanding of data architecture principles, data mesh, and distributed processing.
· Prior experience in customer-facing roles or professional services.
· Background in data security, compliance (e.g., GDPR), and regional data residency awareness.
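For the graph side of the role, a minimal sketch of a parameterised Cypher query run through the official neo4j Python driver, assuming a hypothetical local instance; the credentials, node labels, and relationship type are illustrative only.
```python
# A minimal sketch, assuming the official neo4j Python driver and a hypothetical
# local instance and graph model; labels, relationship, and credentials are
# illustrative only.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # hypothetical connection details
AUTH = ("neo4j", "password")

CYPHER = """
MATCH (c:Customer)-[:PURCHASED]->(p:Product {name: $product})
RETURN c.name AS customer
"""

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        # Parameterised Cypher keeps query plans cacheable and avoids injection.
        result = session.run(CYPHER, product="Widget")
        for record in result:
            print(record["customer"])
```
The Neo4j Graph Data Science library is exposed through Cypher CALL procedures, so it can be invoked through this same driver once the library is installed on the server.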
Job Types: Temporary, fixed-term contract
Pay: £483.11-£500.00 per day
Work authorisation:
United Kingdom (preferred)