

N Consulting Global
Data Lakehouse Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Lakehouse Engineer, with contract length and pay rate unspecified. Work is on-site in Dublin, Basildon, or Bad Homburg. Key skills include Apache Iceberg, PySpark, Python, and AWS Data Lake Engineering.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 16, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Basildon, England, United Kingdom
-
🧠 - Skills detailed
#Data Lakehouse #AWS (Amazon Web Services) #Data Processing #Scala #Data Engineering #Data Lake #PySpark #AWS Glue #S3 (Amazon Simple Storage Service) #Cloud #Apache Iceberg #Python #Spark (Apache Spark) #ETL (Extract, Transform, Load)
Role description
🚨 Hiring Now – Data Lakehouse Engineer | Europe Locations
We are looking for experienced Data Lakehouse Engineers with strong expertise in:
✔️ Apache Iceberg
✔️ PySpark
✔️ Python
✔️ AWS Data Lake Engineering
📍 Locations:
• Dublin, Ireland
• Basildon, UK
• Bad Homburg, Germany
💼 Key Responsibilities:
• Build and optimize scalable data lakehouse platforms
• Develop high-performance PySpark pipelines
• Design cloud-native AWS data solutions
• Work on large-scale data processing and transformation
✅ Preferred Experience:
• AWS Glue / EMR / S3
• Spark optimization
• Lakehouse architecture
• Modern data engineering practices
Interested candidates can share their updated CV or connect directly.
#Hiring #DataEngineer #PySpark #ApacheIceberg #AWS #Lakehouse #BigData #CloudEngineering #IrelandJobs #UKJobs #GermanyJobs





