

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, offering £500.00-£550.00 per day. Key skills include GCP (4+ years), SQL, BigQuery, and Neo4j. A Bachelor's degree and Google Professional Data Engineer certification are required. The role is remote.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
550
🗓️ - Date discovered
July 10, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#Data Security #Airflow #Data Engineering #Data Ingestion #Compliance #Schema Design #Neo4J #Data Analysis #Datasets #ETL (Extract, Transform, Load) #Data Manipulation #GDPR (General Data Protection Regulation) #Data Processing #Scala #Data Architecture #Code Reviews #Spark (Apache Spark) #Programming #Apache Beam #Visualization #Data Lake #Java #Scripting #Dataflow #Big Data #Cloud #Data Science #Azure #Clustering #Python #Libraries #GIT #Data Pipeline #AWS (Amazon Web Services) #SQL (Structured Query Language) #Apache Airflow #Computer Science #Hadoop #Storage #BigQuery #Apache Spark #GCP (Google Cloud Platform) #Version Control #Security #Data Integrity
Role description
Overview
We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will be responsible for designing, constructing, and maintaining scalable data pipelines and architectures. You will play a crucial role in transforming raw data into actionable insights, enabling our organisation to make data-driven decisions. This position requires a strong understanding of data warehousing concepts and proficiency in various programming languages and tools.
Required Qualifications
· Bachelor's degree in Computer Science, Engineering, or a related field.
· 3+ years of hands-on experience with Neo4j or another Graph DB platform.
· 4+ years of hands-on experience as a Data Engineer, with at least 2 years specifically working with Google Cloud Platform (GCP) data services.
· Strong proficiency in SQL and experience with schema design and query optimization for large datasets.
· Expertise in BigQuery, including advanced SQL, partitioning, clustering, and performance tuning (see the sketch after this list).
· Hands-on experience with at least one of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
· Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
· Understanding of data warehousing and data lake concepts and best practices.
· Experience with version control systems (e.g., Git).
· Proficiency in SQL, Python, and Cypher query language.
· Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
· Knowledge of graph theory, graph schema modeling, and data relationship mapping.
· Fluent in English.
· Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
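For illustration only, here is a minimal sketch of the kind of BigQuery and Python work the SQL, partitioning, and clustering requirements above refer to. It is not part of the client's stack: the project, dataset, table, and column names are hypothetical, and it assumes the google-cloud-bigquery client library with application-default credentials.

```python
# Hypothetical example: query a date-partitioned, clustered BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Filtering on the partition column (event_date) limits the bytes scanned;
# filtering on a clustering column (country) further prunes storage blocks.
sql = """
    SELECT user_id, COUNT(*) AS event_count
    FROM `my-project.analytics.events`
    WHERE event_date = '2025-07-01'   -- partition column
      AND country = 'GB'              -- clustering column
    GROUP BY user_id
    ORDER BY event_count DESC
    LIMIT 100
"""

for row in client.query(sql).result():
    print(row.user_id, row.event_count)
```

Partition and clustering filters of this kind are what keep query cost and latency manageable on large datasets.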
Preferred Qualifications
· Overall 8 to 10+ years of experience.
· Experience creating graph databases with open-source Neo4j, including data ingestion and optimization techniques (see the sketch after this list).
· Familiarity with the Graph Data Science library in Neo4j.
· Understanding of data architecture principles, data mesh, and distributed processing.
· Prior experience in customer-facing roles or professional services.
· Background in data security, compliance (e.g., GDPR), and regional data residency awareness.
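As a sketch of the Neo4j, Cypher, and graph schema modeling skills mentioned above: the (:Customer)-[:PLACED]->(:Order) model, connection details, and credentials below are hypothetical, and it assumes the official neo4j Python driver.

```python
# Hypothetical example: idempotently load a small customer/order graph.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def link_order(tx, customer_id, order_id):
    # MERGE keeps the load idempotent: nodes and the relationship are
    # created only if they do not already exist.
    tx.run(
        "MERGE (c:Customer {id: $cid}) "
        "MERGE (o:Order {id: $oid}) "
        "MERGE (c)-[:PLACED]->(o)",
        cid=customer_id, oid=order_id,
    )

with driver.session() as session:
    session.execute_write(link_order, "C-1001", "O-5001")

driver.close()
```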
Responsibilities
· Design, develop, and maintain data pipelines and architectures using SQL and AWS technologies (see the sketch after this list).
· Collaborate with data scientists to tune performance and optimize data retrieval processes.
· Implement and manage data warehousing solutions, ensuring data integrity and accessibility.
· Utilize tools such as Spark and Neo4j for data analysis and visualization.
· Document processes and workflows, and participate in code reviews using Git.
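As a rough sketch of the pipeline development responsibility above, here is a minimal Apache Beam (Python SDK) pipeline of the kind that could run on Dataflow. The input and output paths, the CSV layout, and the local runner choice are assumptions for illustration only.

```python
# Hypothetical example: count records per user from a headerless CSV file.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(runner="DirectRunner")  # swap to DataflowRunner on GCP

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("events.csv")        # hypothetical input
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeyByUser" >> beam.Map(lambda cols: (cols[0], 1))  # assumes user_id in column 0
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
        | "Write" >> beam.io.WriteToText("per_user_counts")   # hypothetical output prefix
    )
```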
Job Type: Fixed-term contract
Contract length: 6 months
Pay: £500.00-£550.00 per day
Benefits:
On-site parking
Work from home
Schedule:
Day shift
Monday to Friday
Experience:
Total: 8 years (required)
GCP: 4 years (required)
Neo4j: 3 years (required)
Graph DB platform: 3 years (required)
BigQuery: 4 years (required)
Python, Java, Scala: 4 years (required)
Data mesh and distributed processing: 3 years (required)
Work Location: Remote