

ValueMomentum
Azure Databricks Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Databricks Engineer with 12+ years of experience in data solutions, focusing on Azure services and the insurance domain. Contract length is unspecified, with a competitive pay rate. Key skills include SQL, Python, and ETL tooling.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 4, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
England, United Kingdom
-
🧠 - Skills detailed
#Data Lake #Jira #Azure Databricks #SQL Server #Cloud #Python #ADF (Azure Data Factory) #Informatica #MDM (Master Data Management) #Agile #Databases #Big Data #Microsoft Azure #Synapse #Azure #Delta Lake #SQL (Structured Query Language) #Storage #DevOps #Security #Azure Synapse Analytics #Databricks #Public Cloud #Scrum #Azure ADLS (Azure Data Lake Storage) #ADLS (Azure Data Lake Storage) #Oracle #Data Ingestion #Data Management #Data Quality #Azure SQL #IICS (Informatica Intelligent Cloud Services) #Data Integration #Data Processing #Azure SQL Database #ETL (Extract, Transform, Load) #Azure Data Factory
Role description
We are seeking an experienced Azure Databricks Engineer to design and develop modern data solutions, focusing on cloud-native platforms and architecture.
• 12+ years of experience in data ingestion, data processing, and analytical pipelines for big data and relational databases.
• Over 4 years of hands-on experience with Microsoft Azure services, including Azure Data Lake Storage (ADLS), Azure Databricks, Azure Data Factory, Azure Synapse Analytics, and Azure SQL Database.
• Extensive hands-on experience with SQL, Python, and data integration/ingestion patterns; ETL tooling such as Informatica IICS, ADF, notebooks, Databricks, and Delta Lake; warehousing technologies and associated patterns; and cloud platforms, with Azure preferred (an illustrative sketch follows this list).
• Proven experience in integrating, modelling, and transforming insurance domain data, ideally within the Lloyd's, specialty, or insurance/reinsurance markets.
• Experience with on-prem and cloud versions of databases such as Oracle and SQL Server.
• Experience with Agile delivery frameworks/methodologies (e.g. Scrum, SAFe) and tools (e.g. Jira, Azure DevOps).
• Experience with mass ingestion capabilities, cloud process flows, data quality, and master data management.
• Understanding of data-related security challenges and the relevant tooling within specific technologies (e.g. Databricks).
• In-depth knowledge of data delivery and associated architecture principles, data modelling concepts, and all steps of the data production process.
• Advanced verbal and written communication skills, active listening, and teamwork.
• Professional certifications in public cloud platforms and tooling (Databricks and Azure) are highly desired.
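For illustration, a minimal PySpark sketch of the kind of ADLS-to-Delta Lake ingestion pattern referenced above. The storage account, container, landing path, policy_id column, and target table name are hypothetical placeholders, not details taken from this posting.

```python
# Minimal sketch: land a raw CSV extract from ADLS Gen2 into a Delta table on Azure Databricks.
# All paths and names below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook the `spark` session is already provided; this keeps the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

# Hypothetical ADLS Gen2 landing path for a raw policy extract.
raw_path = "abfss://raw@examplestorageaccount.dfs.core.windows.net/policies/2025/10/"

# Read the raw file with header and inferred schema.
raw_df = (
    spark.read
    .format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load(raw_path)
)

# Light standardisation: de-duplicate on an assumed policy_id column and stamp ingestion time for lineage.
curated_df = (
    raw_df
    .dropDuplicates(["policy_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Append into a managed Delta table (Delta is the default table format on Databricks).
(
    curated_df.write
    .format("delta")
    .mode("append")
    .saveAsTable("curated.policies")
)
```

In practice a notebook like this would typically be parameterised per source and orchestrated by Azure Data Factory or Databricks workflows, with data quality checks applied before the curated write.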
About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.