

GIOS Technology
Senior Data Engineer (Azure/DataBricks)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Azure/DataBricks) based in Manchester, UK, with a contract of unspecified duration and a listed day rate of £488. Key skills required include Azure Data Services, Databricks, Python, and data governance, with experience in the Lloyd’s of London market preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
488
-
🗓️ - Date
February 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Manchester Area, United Kingdom
-
🧠 - Skills detailed
#Azure #Data Governance #Automation #Data Engineering #DevOps #Azure DevOps #Spark (Apache Spark) #PySpark #Azure Databricks #Delta Lake #Cloud #Data Lifecycle #Scala #ADF (Azure Data Factory) #Leadership #Spark SQL #SQL (Structured Query Language) #Data Lake #Observability #Strategy #Vault #Data Strategy #Data Lineage #Compliance #Python #Databricks
Role description
We are looking for a Lead Azure & Databricks Engineer in Manchester, UK – 2 days per week on-site
Role Description:
• We are seeking an exceptional Lead Azure & Databricks Engineer to drive the design, build, optimisation, and governance of our enterprise data platforms. The ideal candidate combines deep technical expertise with hands-on leadership, excels in complex and highly regulated environments, and has proven delivery experience within the Lloyd’s of London market. You will play a pivotal role in shaping our cloud data strategy, enabling advanced analytics, and ensuring robust, scalable, and secure solutions across our organisation. This is a 100% hands-on role.
Key Responsibilities
• Lead the development and optimisation of large-scale data platforms using Azure (ADF, Data Lake, Key Vault, Azure Functions) and Databricks (Delta Lake, PySpark, Unity Catalog).
• Work closely with underwriting, actuarial, delegated authority, bordereaux, exposure management, reinsurance, finance (Solvency II), and risk functions to deliver solutions aligned to Lloyd’s of London regulatory and operational requirements, including compliance, auditability, and data lineage.
• Partner with architects, SMEs, product owners, and leadership to translate business needs into scalable cloud data solutions.
• Own and resolve complex technical challenges with speed and clarity—someone who consistently gets the job done.
• Champion best-practice design patterns, data lifecycle management, CI/CD, automation, and cloud-native engineering principles.
• Provide technical leadership and mentoring, setting engineering standards and uplifting capability across the team.
• Drive continuous improvement, proactively identifying opportunities to enhance platform performance, reduce cost, and improve resilience.
• Oversee end-to-end data engineering delivery, ensuring high-quality pipelines, reliability, observability, and performance.
Required Experience & Skills
• Expert-level engineering experience across Azure Data Services and Databricks in enterprise-scale environments.
• Deep understanding of Delta Lake, Medallion architecture, distributed compute, Lakehouse patterns, data modelling and optimisation.
• Strong Python, PySpark, and Spark SQL skills, plus experience implementing CI/CD for data solutions (Azure DevOps, branching and merging).
• Solid understanding of data governance, lineage, access controls, FinOps, and secure cloud engineering practices.
• Outstanding communication skills—able to interface confidently with senior stakeholders, articulate complex concepts clearly, and lead cross-functional discussions.
• Delivery-focused mindset with a proven track record of ownership, accountability, and seeing initiatives through to completion.
• Experience working in fast-paced, high-expectation environments with tight deadlines and evolving requirements.
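As a rough illustration of the medallion (bronze/silver/gold) pattern the skills list refers to, here is a minimal sketch in plain Python. On the platform described, each layer would be a Delta Lake table transformed with PySpark on Databricks; plain dicts and lists stand in here, and all field names (`policy_id`, `region`, `premium`) are invented for the example.

```python
# Minimal medallion-architecture sketch: bronze (raw) -> silver (cleaned)
# -> gold (aggregated). In a real Databricks pipeline these would be
# Delta Lake tables and PySpark DataFrames; the record schema is hypothetical.

def to_silver(bronze_rows):
    """Clean raw rows: drop records missing the key, normalise region, cast premium."""
    silver = []
    for row in bronze_rows:
        if row.get("policy_id") is None:
            continue  # a real pipeline would quarantine this and record lineage
        silver.append({
            "policy_id": row["policy_id"],
            "region": (row.get("region") or "UNKNOWN").upper(),
            "premium": float(row.get("premium", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into premium totals per region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["premium"]
    return totals

bronze = [
    {"policy_id": "P1", "region": "uk", "premium": "100.5"},
    {"policy_id": None, "region": "uk", "premium": "50"},  # dropped in silver
    {"policy_id": "P2", "region": None, "premium": 200},
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 100.5, 'UNKNOWN': 200.0}
```

The same shape carries over to PySpark: `to_silver` becomes a filtered, cast `DataFrame` written to a silver Delta table, and `to_gold` becomes a `groupBy`/`sum` written to gold.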






