DELTACLASS TECHNOLOGY SOLUTIONS LIMITED

Data Modeler - Snowflake

⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a contract role for a Data Modeler specializing in Snowflake, with a contract length of "X months" at a pay rate of "$X/hour." Candidates should have strong insurance data domain experience, advanced data modeling skills, and proficiency in SQL and AWS services.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Data Engineering #Airflow #Databases #Cloud #Documentation #Athena #Data Pipeline #Physical Data Model #Compliance #GDPR (General Data Protection Regulation) #Snowflake #SQL (Structured Query Language) #Data Architecture #Data Lake #dbt (data build tool) #Data Documentation #Scala #AWS (Amazon Web Services) #Redshift #Agile #BI (Business Intelligence)
Role description
• Support the organisation's shift to a data-driven operating model by translating business needs into scalable data models
• Build enterprise-scale, cloud-hosted analytics capabilities
• Enable trusted, secure, and high-quality data for analytics/BI
• Work across insurance data domains (policy, claims, underwriting, finance)
• Partner closely with data engineers, analysts, and business teams
• Ensure data platforms meet performance, governance, and compliance needs
• Operate within an Agile, collaborative delivery environment

Craft & Skills
• Design and maintain conceptual, logical, and physical data models
• Apply star and snowflake schemas for analytical workloads
• Analyse business requirements and translate them into data models
• Contribute to enterprise data architecture and platform design
• Work with data engineers to implement models using dbt
• Optimise data models and tables for query performance
• Collaborate across teams using Agile tools and ways of working
• Ensure data models comply with governance and GDPR standards
• Maintain data documentation, dictionaries, and lineage
• Strong experience in the insurance data domain
• Advanced data modelling expertise across multiple schemas
• Hands-on experience with data warehousing and data lakes
• Proficiency in SQL, including performance tuning
• Experience with AWS data services (Athena, Redshift)
• Understanding of analytical and transactional databases
• Experience aligning data architecture with GDPR requirements
• Strong communication skills with technical and non-technical stakeholders
• Knowledge of dbt and Airflow for data pipelines is a bonus