

Osmii
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Lead Databricks Engineer) on a hybrid contract in London with a 6-month initial term, requiring expertise in Databricks, Delta Lake, and AWS. It also requires financial services experience and strong Python skills for managing petabyte-scale data.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
April 21, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Cloud #Metadata #Python #Git #Programming #Delta Lake #AWS S3 (Amazon Simple Storage Service) #"ETL (Extract, Transform, Load)" #Data Engineering #Databricks #Unit Testing #Datasets #Snowflake #AWS (Amazon Web Services) #Migration #Strategy #Data Management #Linux #Apache Iceberg #Source Code Control
Role description
Lead Databricks Engineer
London - Hybrid
Contract - 6-month initial term.
We are seeking a hands-on Lead Databricks Engineer and Architect to spearhead this transformation. You will be directly responsible for re-engineering our order book data and for ensuring the platform can support high-cardinality data at scale.
Objectives:
• Architecture Design: Implement a robust Delta Lake architecture capable of handling petabyte-scale financial data.
• Data Modelling: Model and partition high-cardinality order book data for optimal query performance.
• Platform Governance: Implement advanced metadata management, compaction, and versioning via Unity Catalog.
• Interoperability: Leverage Delta UniForm to ensure seamless compatibility with Apache Iceberg.
• Resilience: Design and implement a viable backup and recovery strategy for massive datasets with a strict 1-day RTO.
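As context for the Data Modelling objective above: partitioning directly on a high-cardinality column such as an instrument symbol creates one partition per symbol and a small-file explosion, so a common pattern is to hash symbols into a bounded number of buckets (often combined with a date partition). A minimal sketch of one such bucketing scheme, purely illustrative and not the client's actual design:

```python
import hashlib

def partition_bucket(symbol: str, n_buckets: int = 64) -> int:
    """Map a high-cardinality instrument symbol to a stable bucket id.

    Using a hash bucket as the partition key bounds the partition count
    at n_buckets (times any date partition), instead of one partition
    per symbol. The symbol and bucket count here are hypothetical.
    """
    # md5 gives a deterministic digest, so the same symbol always
    # lands in the same bucket across runs and across writers.
    digest = hashlib.md5(symbol.encode("utf-8")).hexdigest()
    return int(digest, 16) % n_buckets

# Same symbol, same bucket; every bucket id stays within range.
bucket = partition_bucket("AAPL")
assert bucket == partition_bucket("AAPL")
assert 0 <= bucket < 64
```

Queries that filter on a symbol can then compute its bucket and prune to a single partition, while writes stay spread evenly across a fixed set of directories.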
Essential skills:
• Databricks Mastery: Expert-level experience with the Databricks ecosystem, specifically Delta Lake, Delta Tables, and Unity Catalog.
• Open Table Formats: Deep industry experience with Apache Iceberg and Delta UniForm.
• Cloud Infrastructure: Proven experience with AWS (S3 Tables, Lake Formation) and managing data at scale in the cloud.
• Environment: Strong proficiency in developing on Linux platforms.
• Engineering Rigour: Mastery of modern development methodologies, including source code control (Git), unit testing, and CI/CD.
Preferred Experience:
• Financial Services: Experience working with high-frequency financial or market data.
• Scale: Direct experience managing petabyte-scale data volumes.
• Programming: Strong Python development skills within a data engineering context.
• Ecosystem Knowledge: Previous experience with Snowflake to assist in the migration strategy.