

BMLL
Databricks Lead Engineer (Contract)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Lead Engineer (Contract) with a hybrid work location. The contract length and pay rate are unspecified. Key skills required include Databricks, Delta Lake, AWS, and experience in financial data handling.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Source Code Control #Metadata #Strategy #AWS (Amazon Web Services) #Apache Iceberg #AWS S3 (Amazon Simple Storage Service) #Python #Linux #S3 (Amazon Simple Storage Service) #Unit Testing #Delta Lake #Snowflake #Databricks #Storage #ETL (Extract, Transform, Load) #Migration
Role description
About BMLL:
BMLL is the leading independent provider of harmonised Level 3, 2 and 1 historical data and analytics across global equities, ETFs, futures and US equity options. We provide market participants with immediate access to granular T+1 order book data and advanced analytics, enabling them to accelerate research, optimise trading strategies, and better understand market behaviour.
BMLL was acquired in 2025 by Nordic Capital, alongside minority shareholder Optiver, marking a joint commitment to accelerate the company's next phase of growth.
We offer an inclusive and collaborative culture, a hybrid working environment that includes regular days in our London office, weekly team lunches, and a variety of out-of-hours social activities.
For more information, visit our website or follow us on X (@bmlltech) and LinkedIn @BMLL.
About the Role:
BMLL transforms financial exchanges' raw market data into an accessible and normalised view for customers across multiple use-cases. We hold over 10 years of full-depth market data across approximately 100 venues; this currently consists of about 1.5 billion HDF5 files (~1.5 PB) stored on S3 and catalogued in Postgres, partitioned by date, venue, and instrument.
While this structure has served us well, the enormous number of small files, combined with a Postgres catalogue held in separate storage, is now the main constraint on how we evolve, back up, recover, and share data. Our strategic direction is to migrate this processed layer to Delta Lake for both data and catalogue. In addition, we are migrating our Data Products, which currently sit in Snowflake, to Iceberg-compatible Delta tables in order to realise performance, cost, and manageability benefits.
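As an illustration of that direction (a sketch only: the catalog, table, and column names are invented, and it assumes a Databricks runtime with Unity Catalog where a spark session is already provided), an Iceberg-compatible Delta table can be declared with Delta UniForm enabled via table properties:

    # Create a Delta table that Iceberg readers (e.g. Snowflake) can also consume,
    # by enabling Delta UniForm. Names below are illustrative, not BMLL's schema.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.market_data.daily_trades (
            trade_date  DATE,
            venue       STRING,
            instrument  STRING,
            price       DECIMAL(18, 8),
            quantity    BIGINT
        )
        TBLPROPERTIES (
            'delta.enableIcebergCompatV2'          = 'true',
            'delta.universalFormat.enabledFormats' = 'iceberg'
        )
    """)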
We are looking for an experienced Databricks Lead Engineer and Architect to join and support the Delta Lake and Delta Table re-engineering projects. This is a hands-on role with direct contribution to the project.
The goal is not merely format migration, but addressing key architectural objectives for high-cardinality data (a brief sketch follows this list):
• Implement a Delta Lake architecture at large scale
• Model and partition L2/L3 order book data in Delta Lake
• Implement metadata management, compaction, and versioning
• Design a system that supports multiple delivery models downstream
• Implement a viable backup strategy at this scale with a 1-day RTO
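A minimal PySpark sketch of the modelling, compaction, and versioning objectives above, under the same assumptions as before (Databricks runtime, illustrative names; liquid clustering is one plausible way to handle the high-cardinality instrument key, not a statement of BMLL's design):

    # Cluster, rather than partition, on the high-cardinality instrument key
    # so the table does not fragment into billions of tiny files.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.market_data.order_book_events (
            event_date  DATE,
            venue       STRING,
            instrument  STRING,
            seq_num     BIGINT,
            side        STRING,
            price       DECIMAL(18, 8),
            size        BIGINT
        )
        CLUSTER BY (event_date, venue, instrument)
    """)

    # Compaction: rewrite many small underlying files into larger, well-clustered ones.
    spark.sql("OPTIMIZE main.market_data.order_book_events")

    # Versioning: the Delta transaction log gives point-in-time reads (time travel),
    # one building block for reproducible research and backup/recovery.
    previous = spark.sql(
        "SELECT * FROM main.market_data.order_book_events VERSION AS OF 12"  # assumes version 12 exists
    )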
This is an opportunity for an experienced engineer to join a cutting-edge FinTech company and make an impact on a critical and large-scale re-engineering project.
Requirements
ESSENTIAL:
• Industry experience with Databricks
• Delta Lake
• Unity Catalog
• Delta Tables
• Delta UniForm (Universal Format)
• Industry experience with Apache Iceberg
• AWS, S3 Tables, Lake Formation
• Industry experience in developing on a Linux platform
• Experience with industry-standard development methodologies such as source code control, unit testing and continuous integration
• A self-starter with the ability to self-organise
• Strong problem-solving skills
• Strong communication skills
DESIRABLE:
• Industry experience with Snowflake
• Industry experience with petabyte scale data volumes
• Industry experience with Python
• Experience working with financial data