

DELTACLASS TECHNOLOGY SOLUTIONS LIMITED
Data Modeler with Insurance and Snowflake
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeler with insurance experience, focusing on Snowflake. It offers a contract length of "X months" at a pay rate of "$X/hour." Key skills include advanced data modeling, SQL proficiency, and AWS services.
Country: United Kingdom
Currency: £ GBP
Day rate: Unknown
Date: March 24, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Greater London, England, United Kingdom
Skills detailed: #Physical Data Model #Data Lake #Redshift #GDPR (General Data Protection Regulation) #Databases #Scala #Cloud #Agile #Data Pipeline #Documentation #AWS (Amazon Web Services) #BI (Business Intelligence) #Compliance #Athena #Data Documentation #Data Engineering #SQL (Structured Query Language) #Data Architecture #dbt (data build tool) #Airflow #Snowflake
Role description
Overview of role
• Support the organisation's shift to a data-driven operating model by translating business needs into scalable data models
• Build enterprise-scale, cloud-hosted analytics capabilities
• Enable trusted, secure, and high-quality data for analytics/BI
• Work across insurance data domains (policy, claims, underwriting, finance)
• Partner closely with data engineers, analysts, and business teams
• Ensure data platforms meet performance, governance, and compliance needs
• Operate within an Agile, collaborative delivery environment
Craft & Skills
• Design and maintain conceptual, logical, and physical data models
• Apply star and snowflake schemas for analytical workloads
• Analyse business requirements and translate them into data models
• Contribute to enterprise data architecture and platform design
• Work with data engineers to implement models using dbt
• Optimise data models and tables for query performance
• Collaborate across teams using Agile tools and ways of working
• Ensure data models comply with governance and GDPR standards
• Maintain data documentation, dictionaries, and lineage
• Strong experience in the insurance data domain
• Advanced data modelling expertise across multiple schemas
• Hands-on experience with data warehousing and data lakes
• Proficiency in SQL, including performance tuning
• Experience with AWS data services (Athena, Redshift)
• Understanding of analytical and transactional databases
• Experience aligning data architecture with GDPR requirements
• Strong communication skills with technical and non-technical stakeholders
• Knowledge of dbt and Airflow for data pipelines is a bonus
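To illustrate the star-schema modelling the role calls for, here is a minimal, self-contained sketch using Python's built-in sqlite3 module: a policy dimension joined to a claims fact table, with a typical analytical roll-up. All table and column names (dim_policy, fact_claim, etc.) are hypothetical examples, not taken from the role description.

```python
import sqlite3

# In-memory database for a tiny insurance star schema (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per policy, holding descriptive attributes.
cur.execute("""
    CREATE TABLE dim_policy (
        policy_key   INTEGER PRIMARY KEY,
        policy_number TEXT,
        product_line  TEXT
    )
""")

# Fact table: one row per claim, linked to the dimension by a surrogate key.
cur.execute("""
    CREATE TABLE fact_claim (
        claim_key    INTEGER PRIMARY KEY,
        policy_key   INTEGER REFERENCES dim_policy(policy_key),
        claim_amount REAL,
        claim_date   TEXT
    )
""")

cur.executemany("INSERT INTO dim_policy VALUES (?, ?, ?)", [
    (1, "POL-001", "Motor"),
    (2, "POL-002", "Property"),
])
cur.executemany("INSERT INTO fact_claim VALUES (?, ?, ?, ?)", [
    (1, 1, 1200.0, "2026-01-15"),
    (2, 1, 300.0, "2026-02-02"),
    (3, 2, 5000.0, "2026-02-20"),
])

# Typical analytical query: total claim amount by product line.
rows = cur.execute("""
    SELECT d.product_line, SUM(f.claim_amount) AS total_claims
    FROM fact_claim AS f
    JOIN dim_policy AS d USING (policy_key)
    GROUP BY d.product_line
    ORDER BY d.product_line
""").fetchall()
print(rows)  # [('Motor', 1500.0), ('Property', 5000.0)]
```

In practice the same pattern scales out to the domains listed above (policy, claims, underwriting, finance), with conformed dimensions shared across fact tables and the transformations typically managed in dbt rather than hand-run SQL.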






