Hays

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Databricks Engineer in London, running until 31/12/2026, at a rate of £500/day inside IR35. Key skills include SQL, Python and data integration, with insurance domain experience preferred. BPSS eligibility is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
500
🗓️ - Date
February 14, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Inside IR35
🔒 - Security
BPSS
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#MDM (Master Data Management) #Data Pipeline #ETL (Extract, Transform, Load) #Data Analysis #Vault #Azure Databricks #IICS (Informatica Intelligent Cloud Services) #Databricks #SQL (Structured Query Language) #Data Strategy #ADF (Azure Data Factory) #Databases #Data Management #Data Vault #SQL Server #Delta Lake #Data Architecture #Code Reviews #Jira #Data Modeling #DevOps #Informatica #Data Engineering #Strategy #Security #Public Cloud #Python #Azure #Data Integration #Cloud #Oracle #Agile #Scrum #Data Quality
Role description
CONTRACTOR MUST BE ELIGIBLE FOR BPSS

Role Title: Azure Databricks Engineer
Location: London
Duration: until 31/12/2026
Work setup: 3 days onsite/week
Rate: £500/day Inside IR35 (via umbrella payroll)

Role Description:
• Strong hands-on skills that can be leveraged directly in the deliverable while ensuring the team is working effectively.
• Design, build and maintain data pipelines and ELT workflows on the Databricks platform using the Medallion architecture (a minimal sketch follows this description).
• Analyse data requirements and apply data analysis, data modelling (including Data Vault; a second sketch follows) and data quality techniques to establish, modify or maintain data structures and their associated components in complex environments.
• Partner with data architects, data analysts, product managers and testers to deliver reliable data sets.
• Document data flows, transformation logic and processes for knowledge sharing and ongoing support.
• Be passionate about solving problems, enjoy connecting the dots between data, strategy and analytics, and focus on generating tangible benefits and high performance.
• Collaborate in agile teams; participate in code reviews, solution design and platform evolution.

Skills and Experience
• Extensive hands-on experience in SQL, Python, and data integration/ingestion and associated patterns: ETL tooling (Informatica IICS, ADF, notebooks), Databricks, Delta Lake, warehousing technologies, and cloud platforms (Azure preferred).
• Proven experience in integrating, modelling and transforming insurance domain data, ideally within the Lloyd's, specialty or insurance/reinsurance market.
• Experience with on-prem and cloud versions of databases such as Oracle and SQL Server.
• Experience with Agile delivery frameworks/methodologies (e.g. Scrum, SAFe) and tools (e.g. Jira, Azure DevOps).
• Experience with mass ingestion capabilities and cloud process flows, data quality and master data management.
• Understanding of data-related security challenges and tooling for specific technologies (e.g. Databricks).
• Experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, and all steps of the data production process.
• Advanced verbal and written communication skills, active listening and teamwork.
• Professional certifications in public cloud and tooling (Databricks and Azure) are highly desirable.
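The listing itself contains no code; purely as an illustration of the Medallion pattern named above, here is a minimal PySpark sketch of a bronze-to-silver hop on Delta Lake. All paths, table names and columns (claims_raw, claim_id, claim_amount, loss_date) are hypothetical, not taken from the role.

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` already exists; created here so the sketch is self-contained.
spark = SparkSession.builder.getOrCreate()

# Bronze: land raw claim files exactly as received (hypothetical path and schema).
bronze = spark.read.format("json").load("/mnt/raw/claims/")
bronze.write.format("delta").mode("append").saveAsTable("bronze.claims_raw")

# Silver: typed, cleaned and deduplicated view of the bronze data.
silver = (
    spark.read.table("bronze.claims_raw")
    .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
    .withColumn("loss_date", F.to_date("loss_date"))
    .dropDuplicates(["claim_id"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims")
```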
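Similarly, for the Data Vault modelling the role mentions, a minimal hub-table sketch under the same assumptions (silver.policies, gold.hub_policy and policy_number are hypothetical names):

```python
from pyspark.sql import functions as F

# Hypothetical silver-layer table of insurance policies (`spark` as in the previous sketch).
policies = spark.read.table("silver.policies")

# Hub: one row per distinct business key, with a hash key plus load metadata.
hub_policy = (
    policies.select("policy_number").distinct()
    .withColumn("hub_policy_hk", F.sha2(F.col("policy_number"), 256))  # deterministic hash key
    .withColumn("load_ts", F.current_timestamp())
    .withColumn("record_source", F.lit("silver.policies"))
)
hub_policy.write.format("delta").mode("append").saveAsTable("gold.hub_policy")
```

Links and satellites would follow the same pattern, hashing combined business keys and tracking attribute history respectively.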