

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in London on a full-time, permanent contract, focusing on Azure Databricks, Data Factory, and Delta Lake. Key skills include Python, PySpark, and SQL, along with experience building scalable ETL pipelines.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: June 19, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: London, England, United Kingdom
Skills detailed: #GIT #Visualization #Microsoft Power BI #Data Management #IoT (Internet of Things) #ADF (Azure Data Factory) #SQL Queries #Cloud #Agile #Delta Lake #Spark (Apache Spark) #Data Engineering #Azure DevOps #ETL (Extract, Transform, Load) #Knowledge Graph #Data Transformations #DevOps #Data Layers #Azure Databricks #Monitoring #Deployment #Metadata #Datasets #Azure #Databricks #Python #PySpark #SQL (Structured Query Language) #Azure Data Factory #Automated Testing #Synapse #Kafka (Apache Kafka) #Data Governance #Data Architecture #BI (Business Intelligence) #Data Quality #Data Pipeline #Batch #Scala
Role description
The role
Position: Data Engineer
Contract type: Full Time/Permanent
Reporting to: Head of Data
Location: London
Overview of role
Zodiac Maritime is undergoing an exciting data transformation, and we're looking for a talented Data Engineer to join our growing data team. In this role, you'll be instrumental in building and deploying modern Azure Databricks-based data solutions, enabling the business to make faster, data-driven decisions.
You'll work hands-on with Azure Databricks, Azure Data Factory, Delta Lake, and Power BI to design scalable data pipelines, implement efficient data models, and ensure high-quality data delivery. This is a fantastic opportunity to shape the future of data at Zodiac Maritime while working with cutting-edge cloud technologies.
Key Responsibilities and Primary Deliverables
• Design, develop, and optimize end-to-end data pipelines (batch & streaming) using Azure Databricks, Spark, and Delta Lake.
• Implement Medallion Architecture to structure raw, enriched, and curated data layers efficiently (a minimal sketch follows this list).
• Build scalable ETL/ELT processes with Azure Data Factory and PySpark.
• Work with Data Architecture to enforce data governance using Azure Purview and Unity Catalog for metadata management, lineage, and access control.
• Ensure data consistency, accuracy, and reliability across pipelines.
• Collaborate with analysts to validate and refine datasets for reporting.
• Apply DevOps & CI/CD best practices (Git, Azure DevOps) for automated testing and deployment.
• Optimize Spark jobs, Delta Lake tables, and SQL queries for performance and cost efficiency.
• Troubleshoot and resolve data pipeline issues proactively.
• Partner with Data Architects, Analysts, and Business Teams to deliver end-to-end solutions.
• Stay ahead of emerging data technologies (e.g., streaming with Kafka/Event Hubs, Knowledge Graphs).
• Advocate for best practices in data engineering across the organization.
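For context on the Medallion Architecture responsibility above, the following is a minimal sketch of a bronze-to-silver step in PySpark on Databricks with Delta Lake. The source path, the "demo" schema, and the vessel_events table and column names are placeholders for illustration, not details taken from this posting or from Zodiac Maritime's actual pipelines.

```python
# Illustrative bronze -> silver step; paths, schema, and column names are
# placeholder assumptions, not details from the role description.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw files as-is, tagging each row with an ingestion timestamp.
bronze = (
    spark.read.format("json")
    .load("/mnt/raw/vessel_events")
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("demo.bronze_vessel_events")

# Silver: apply basic cleansing rules before the data is enriched and curated.
silver = (
    spark.read.table("demo.bronze_vessel_events")
    .filter(F.col("vessel_id").isNotNull())
    .dropDuplicates(["vessel_id", "event_time"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("demo.silver_vessel_events")
```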
Skills profile
Relevant experience & education
• Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse.
• Strong understanding of Lakehouse architecture and medallion design patterns.
• Proficient in Python, PySpark, and SQL (advanced query optimization).
• Experience building scalable ETL pipelines and data transformations.
• Knowledge of data quality frameworks and monitoring (an illustrative check is sketched after this list).
• Experience with Git, CI/CD pipelines, and Agile methodologies.
• Ability to write clean, maintainable, and well-documented code.
• Experience with Power BI or other visualization tools.
• Knowledge of IoT data pipelines.
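As a small illustration of the data quality point above, here is a minimal sketch of rule-based checks written in plain PySpark rather than any specific framework. The table name and the two rules are assumptions chosen for the example.

```python
# Illustrative data-quality checks; the table name and rules are assumptions
# for the example, not requirements from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("demo.silver_vessel_events")

checks = {
    "vessel_id_not_null": df.filter(F.col("vessel_id").isNull()).count() == 0,
    "no_duplicate_events": df.count()
    == df.dropDuplicates(["vessel_id", "event_time"]).count(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a scheduled job this would typically alert or fail the run.
    raise ValueError(f"Data quality checks failed: {failed}")
```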