GIOS Technology

Data Engineer - (Databricks/ETL/Informatica IICS/ADF/Delta Lake/Azure/Insurance/reinsurance)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with expertise in Databricks, ETL, and Azure, and requires insurance industry experience. It offers a hybrid work location in London and focuses on building scalable data pipelines and ELT workflows.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Agile #Data Integration #Delta Lake #Vault #Data Modeling #IICS (Informatica Intelligent Cloud Services) #Databricks #Datasets #SQL (Structured Query Language) #Cloud #Data Vault #Python #Azure #Metadata #Scala #Informatica #Data Pipeline #Data Analysis #ETL (Extract, Transform, Load) #Data Engineering #ADF (Azure Data Factory)
Role description
I am hiring for a Data Engineer - (Databricks/ETL/Informatica IICS/ADF/Delta Lake/Azure/Insurance/reinsurance)

Location: London (Hybrid – 2–3 days onsite weekly)

Job Description
We are hiring a skilled Data Engineer to design, build, and maintain scalable data pipelines and ELT workflows in a cloud environment. The role involves collaborating with architects, analysts, and product teams to deliver high-quality, reliable datasets for enterprise use. The ideal candidate brings strong hands-on engineering capability, insurance domain exposure, and a passion for solving data-centric problems.

Key Responsibilities:
• Design, build, and maintain ELT pipelines on Databricks using Medallion architecture.
• Perform data analysis and apply modeling techniques (including Data Vault) to support complex data structures.
• Integrate datasets across on-prem and cloud systems, ensuring data reliability and quality.
• Collaborate with architects, product managers, analysts, and testers in agile delivery squads.
• Document data flows, transformations, and metadata for support and knowledge sharing.
• Leverage SQL and Python to enable cloud-native, high-performance data solutions.

Key Skills
SQL, Python, Databricks, Delta Lake, Azure, ETL, Data Integration, Data Modeling, Data Vault, Informatica IICS, ADF, Data Warehousing, Agile, Insurance