

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer based in London, offering a full-time permanent contract. Key skills include Azure Databricks, Delta Lake, and ETL processes. Candidates should have experience in data governance, CI/CD, and Python. Duration exceeds six months.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: July 19, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: London, England, United Kingdom
Skills detailed: #Metadata #SQL (Structured Query Language) #Azure #Data Governance #Microsoft Power BI #ADF (Azure Data Factory) #Automated Testing #Data Architecture #PySpark #Python #IoT (Internet of Things) #Scala #Synapse #Datasets #GIT #Visualization #ETL (Extract, Transform, Load) #Batch #Databricks #Kafka (Apache Kafka) #Knowledge Graph #Azure DevOps #Data Layers #Deployment #SQL Queries #Data Quality #DevOps #Monitoring #Data Engineering #Data Transformations #Agile #Azure Data Factory #BI (Business Intelligence) #Spark (Apache Spark) #Azure Databricks #Data Management #Data Pipeline #Cloud #Delta Lake
Role description
The role
Position: Data Engineer
Contract type: Full Time/Permanent
Reporting to: Head of Data
Location: London
Overview of role
Zodiac Maritime is undergoing an exciting data transformation, and we're looking for a talented Data Engineer to join our growing data team. In this role, you'll be instrumental in building and deploying modern Azure Databricks-based data solutions, enabling the business to make faster, data-driven decisions.
You'll work hands-on with Azure Databricks, Azure Data Factory, Delta Lake, and Power BI to design scalable data pipelines, implement efficient data models, and ensure high-quality data delivery. This is a fantastic opportunity to shape the future of data at Zodiac Maritime while working with cutting-edge cloud technologies.
Key Responsibilities and Primary Deliverables
• Design, develop, and optimize end-to-end data pipelines (batch & streaming) using Azure Databricks, Spark, and Delta Lake.
• Implement Medallion Architecture to structure raw, enriched, and curated data layers efficiently (see the illustrative sketch after this list).
• Build scalable ETL/ELT processes with Azure Data Factory and PySpark.
• Work with Data Architecture to enforce data governance using Azure Purview and Unity Catalog for metadata management, lineage, and access control.
• Ensure data consistency, accuracy, and reliability across pipelines.
• Collaborate with analysts to validate and refine datasets for reporting.
• Apply DevOps & CI/CD best practices (Git, Azure DevOps) for automated testing and deployment.
• Optimize Spark jobs, Delta Lake tables, and SQL queries for performance and cost efficiency.
• Troubleshoot and resolve data pipeline issues proactively.
• Partner with Data Architects, Analysts, and Business Teams to deliver end-to-end solutions.
• Stay ahead of emerging data technologies (e.g., streaming with Kafka/Event Hubs, Knowledge Graphs).
• Advocate for best practices in data engineering across the organization.
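The Medallion Architecture responsibility above can be pictured with a short PySpark sketch of a bronze-to-silver step on Databricks. This is a minimal, illustrative example only: the table names (bronze.sales_raw, silver.sales_clean) and columns are hypothetical placeholders, not Zodiac Maritime datasets.

```python
# Minimal medallion-style flow on Databricks: bronze (raw) -> silver (enriched).
# All table and column names here are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Bronze: raw data landed as-is (for example via an Azure Data Factory copy activity).
raw = spark.read.table("bronze.sales_raw")

# Silver: de-duplicate, enforce types, and apply basic quality rules.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Write as a Delta table so curated (gold) layers and Power BI read a consistent, ACID source.
clean.write.format("delta").mode("overwrite").saveAsTable("silver.sales_clean")
```

A gold layer would typically aggregate the silver table into reporting-friendly models, which is where the Power BI datasets mentioned above would connect.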
Skills profile
Relevant experience & education
• Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse.
• Strong understanding of Lakehouse architecture and medallion design patterns.
• Proficient in Python, PySpark, and SQL (advanced query optimization).
• Experience building scalable ETL pipelines and data transformations.
• Knowledge of data quality frameworks and monitoring (a brief maintenance and quality-check sketch follows this list).
• Experience with Git, CI/CD pipelines, and Agile methodologies.
• Ability to write clean, maintainable, and well-documented code.
• Experience of Power BI or other visualization tools.
• Knowledge of IoT data pipelines.
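As a small illustration of the performance and data-quality points above, the sketch below shows routine Delta Lake table maintenance and a simple quality check from a Databricks notebook. The table and column names are hypothetical, and OPTIMIZE/ZORDER assume the Databricks Delta Lake implementation.

```python
# Illustrative Delta Lake maintenance and a basic data-quality gate on Databricks.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows by a commonly filtered column to speed up queries.
spark.sql("OPTIMIZE silver.sales_clean ZORDER BY (customer_id)")

# Remove data files no longer referenced by the table (subject to the retention period).
spark.sql("VACUUM silver.sales_clean")

# Simple quality gate: fail the job if the key column contains nulls.
null_rows = spark.sql(
    "SELECT COUNT(*) AS n FROM silver.sales_clean WHERE order_id IS NULL"
).first()["n"]
assert null_rows == 0, f"Found {null_rows} rows with a null order_id in silver.sales_clean"
```

In practice a dedicated framework (for example Delta Live Tables expectations or Great Expectations) would replace the hand-rolled assert, but the intent is the same: catch bad data before it reaches reporting.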