Hadoop Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Hadoop Developer, hybrid (2 days/week onsite in London or Sheffield), offering competitive pay. Requires 5+ years in Data Engineering, strong Hadoop and Python skills, Apache Spark experience, and knowledge of Apache Airflow.
🌎 - Country
United Kingdom
πŸ’± - Currency
Β£ GBP
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 9, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Hybrid
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Hadoop #Data Engineering #Monitoring #Apache Airflow #Spark (Apache Spark) #Python #Apache Spark #Scripting #HDFS (Hadoop Distributed File System) #Airflow
Role description
Hybrid: 2 days/week onsite in either London or Sheffield

Key Skills & Experience Required:
• 5+ years of experience in Data Engineering.
• Strong hands-on experience with Hadoop (HDFS, Hive, etc.).
• Proficient in Python scripting for data transformation and orchestration.
• Working experience with Apache Spark (including Spark Streaming).
• Solid knowledge of Apache Airflow for pipeline orchestration.
• Exposure to infrastructure data analytics or monitoring data is highly preferred.
• Excellent problem-solving and performance tuning skills.
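For context on how this stack typically fits together, below is a minimal sketch of an Airflow DAG submitting a PySpark job, the kind of pipeline orchestration the role describes. It assumes Apache Airflow 2.x with the apache-airflow-providers-apache-spark package installed; the DAG name, script path, and the "spark_default" connection are hypothetical placeholders, not details from this listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_infra_metrics",      # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a PySpark job that reads raw monitoring data from HDFS,
    # transforms it, and writes the result to a Hive-backed table.
    transform_metrics = SparkSubmitOperator(
        task_id="transform_metrics",
        application="/opt/jobs/transform_metrics.py",  # hypothetical script path
        conn_id="spark_default",
        verbose=False,
    )
```

In a setup like this, Airflow handles scheduling, retries, and monitoring, while Spark does the heavy data transformation against HDFS/Hive, which mirrors the Hadoop, Spark, Python, and Airflow skills listed above.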