

Data Modeler – Market Insurance
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Modeler in Market Insurance, offering a hybrid contract in London for 2-3 days per week. Key requirements include expertise in data modeling, SQL proficiency, and experience with insurance systems and cloud platforms.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date discovered
May 22, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Greater London, England, United Kingdom
-
🧠 - Skills detailed
#Data Lake #ERWin #Apache Spark #Azure DevOps #Agile #DevOps #Data Architecture #Jira #SQL (Structured Query Language) #Delta Lake #Cloud #Data Modeling #GCP (Google Cloud Platform) #Spark (Apache Spark) #dbt (data build tool) #Data Pipeline #AWS (Amazon Web Services) #Azure
Role description
I am hiring for a Data Modeler – Market Insurance
Location: London - Hybrid / 2-3 days per week in the office
• Proven experience in Market insurance, with a deep understanding of insurance market data structures and regulatory reporting.
• Expertise in data modeling (conceptual, logical, physical) using tools such as Erwin, ER/Studio, or dbt.
• Strong knowledge of data warehousing, data lakes, and enterprise data architecture.
• Proficiency in SQL and experience with cloud data platforms (Azure, AWS, or GCP).
• Experience with insurance systems such as Guidewire, Duck Creek, or legacy PAS platforms.
• Knowledge of Delta Lake, Apache Spark, and modern data pipeline tools.
• Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps.
Key Skills: Market insurance / Guidewire / Duck Creek / legacy PAS / AWS / SQL
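
To give a flavour of the conceptual-to-physical modeling and SQL work the role calls for, here is a minimal sketch of a physical schema for a policy/claim relationship. All table names, columns, and figures are illustrative assumptions, not details taken from this role; SQLite stands in for whichever cloud platform the client uses.

```python
import sqlite3

# Hypothetical example: a tiny physical model for a policy -> claim
# one-to-many relationship, of the kind a Data Modeler would derive
# from a logical model. Names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE policy (
    policy_id     INTEGER PRIMARY KEY,
    insured       TEXT NOT NULL,
    inception     TEXT NOT NULL,   -- ISO-8601 date
    expiry        TEXT NOT NULL,
    gross_premium REAL NOT NULL
);
CREATE TABLE claim (
    claim_id   INTEGER PRIMARY KEY,
    policy_id  INTEGER NOT NULL REFERENCES policy(policy_id),
    loss_date  TEXT NOT NULL,
    incurred   REAL NOT NULL
);
""")
conn.execute(
    "INSERT INTO policy VALUES (1, 'Acme Marine', '2025-01-01', '2025-12-31', 125000.0)"
)
conn.execute("INSERT INTO claim VALUES (10, 1, '2025-03-15', 40000.0)")

# A typical reporting query: total incurred claims per policy.
row = conn.execute("""
    SELECT p.policy_id, p.insured, SUM(c.incurred) AS total_incurred
    FROM policy p
    JOIN claim c ON c.policy_id = p.policy_id
    GROUP BY p.policy_id
""").fetchone()
print(row)  # (1, 'Acme Marine', 40000.0)
```

In practice the same structure would typically be expressed as dbt models or Delta Lake tables on Azure, AWS, or GCP, with the foreign-key relationship enforced or tested in the pipeline rather than by the engine.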