MarkJames Search

Data Lakehouse Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Lakehouse Engineer in Basildon, UK, with a 12-month contract. Pay rate is competitive. Key skills include AWS data engineering, PySpark, Python, and Apache Iceberg. Experience with AWS services and data pipeline development is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
April 3, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Fixed Term
🔒 - Security
Unknown
📍 - Location detailed
Essex, England, United Kingdom
🧠 - Skills detailed
#Data Architecture #IAM (Identity and Access Management) #Data Pipeline #Automation #Data Lakehouse #Data Quality #S3 (Amazon Simple Storage Service) #Infrastructure as Code (IaC) #Terraform #PySpark #Data Security #Data Lake #Airflow #Scala #Spark (Apache Spark) #Data Engineering #Lambda (AWS Lambda) #Security #Python #AWS (Amazon Web Services) #DevOps #Apache Iceberg
Role description
Job Title: Data Lakehouse Engineer
Location: Basildon, UK (5 days on-site, Monday–Friday)
Contract Duration: 12 months

We are hiring a Data Lakehouse Engineer to build and maintain scalable data pipelines and lakehouse architectures on AWS. You will play a key role in enabling data-driven decision-making by delivering reliable, high-performance data solutions.

Key Responsibilities
• Develop and maintain AWS-based data lake solutions
• Build data pipelines using PySpark and Python
• Implement and manage an Apache Iceberg-based lakehouse architecture
• Work with AWS services including S3, EMR, Lambda, EKS, and MWAA
• Ensure data quality, integrity, and security
• Collaborate with data architects and analysts to deliver data solutions
• Implement workflow orchestration and automation

Required Skills
• Strong experience in AWS data engineering
• Hands-on expertise in PySpark and Python
• Experience with Apache Iceberg
• Familiarity with AWS services (IAM, Lambda, EKS, S3, EMR, MWAA)
• Experience building scalable data pipelines

Nice to Have
• CI/CD pipelines and DevOps practices
• Terraform / Infrastructure as Code
• Data quality frameworks
• Data security best practices
• Workflow orchestration tools (e.g., Airflow)