HoK Consulting - Technical Recruitment

Lead Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a long-term contract for a Lead Data Engineer based in the UK (hybrid, 2-3 days in Birmingham) with a focus on global banking projects. Key skills include SQL/NoSQL, ETL tools, and cloud platforms (AWS, Azure, GCP).
🌎 - Country
United Kingdom
💱 - Currency
Β£ GBP
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 14, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Birmingham, England, United Kingdom
-
🧠 - Skills detailed
#NiFi (Apache NiFi) #Security #Big Data #Data Quality #Java #Azure #Cloud #Airflow #GCP (Google Cloud Platform) #Spark (Apache Spark) #SQL (Structured Query Language) #Data Science #Hadoop #Compliance #Data Engineering #Data Pipeline #Apache NiFi #ETL (Extract, Transform, Load) #Python #DevOps #AWS (Amazon Web Services) #Databases #Scala #NoSQL
Role description
Job Title: Lead Data Engineer

Duration: Long-term contract

Location: UK based, hybrid (2-3 days per week in Birmingham)

Visa: No sponsorship available / no PSW

Overview:

We’re looking for a hands-on Lead Data Engineer to design, build, and optimize scalable data pipelines and infrastructure for global banking projects. You’ll lead a team, drive engineering best practices, and ensure data reliability, security, and compliance.

Key Responsibilities:

• Lead and mentor a team of data engineers, establishing best practices in data engineering.

• Design, build, and maintain scalable, secure, and reliable data pipelines and architectures.

• Develop and optimize ETL processes, integrating diverse data sources into unified platforms.

• Ensure data quality, governance, and compliance with regulatory requirements.

• Collaborate with data scientists, analysts, and IT teams to support analytical and business goals.

• Monitor, troubleshoot, and enhance data performance and infrastructure.

Key Skills & Experience:

• Strong experience with SQL/NoSQL databases, data warehousing, and big data technologies (Hadoop, Spark).

• Proficient in Python, Java, or Scala, with a solid understanding of OOP and design patterns.

• Expertise in ETL tools, DevOps, and orchestration frameworks (Airflow, Apache NiFi).

• Hands-on experience with cloud platforms (AWS, Azure, or GCP) and their data services.