

Galent
Data Modeler Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Modeler Architect on a 12-month contract, offering a pay rate of "X" per hour. Required skills include ANSI-SQL, Dimensional Data Modelling, ERWIN, MS SQL Server, and Snowflake; insurance domain experience is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 18, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Diego, CA
-
🧠 - Skills detailed
#dbt (data build tool) #JSON (JavaScript Object Notation) #Data Mart #Agile #Normalization #ERWin #Dimensional Modelling #Vault #MS SQL (Microsoft SQL Server) #Data Integration #Data Mapping #Data Analysis #Snowflake #SQL Server #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #BI (Business Intelligence) #Data Vault #Data Warehouse #Physical Data Model #Data Engineering #Documentation #Microsoft Power BI
Role description
Mandatory Skills : ANSI-SQL, Dimensional Data Modelling, ERWIN, MS SQL Server, Snowflake
Good to Have Skills : Data Vault (ODS) Modelling, ER Studio
Job Responsibilities
Design and develop logical and physical data models for the Data Warehouse and Data Mart layers in Snowflake
Create and maintain Source-to-Target Mapping (STM) documents to ensure accurate data mapping for the policy, claims, and financial domains
Define data entities, attributes, and relationships based on business requirements, data sources, and analytical needs
Validate transformation logic for downstream consumption layers used by DBT pipelines and Power BI models
Conduct in-depth data analysis and profiling, including JSON payload interpretation and complex domain structures
Ensure conformance to dimensional modelling, normalization, and denormalization practices
Collaborate closely with Data Engineers to design optimized structures supporting DBT pipelines and Snowflake performance considerations
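The Source-to-Target Mapping responsibility above can be sketched as a simple completeness check: before an STM document feeds downstream DBT pipelines, every target column should trace back to a documented source field and transformation rule. This is a minimal illustration only; all table, column, and rule names below are hypothetical, not taken from the role.

```python
# Hypothetical STM rows: each maps one target column to a source field
# and a transformation rule. An empty source or rule marks a gap.
stm = [
    {"target": "DIM_POLICY.POLICY_KEY",   "source": "STG.POLICY.POLICY_ID", "rule": "surrogate key"},
    {"target": "DIM_POLICY.POLICY_NUM",   "source": "STG.POLICY.POL_NO",    "rule": "direct move"},
    {"target": "DIM_POLICY.EFFECTIVE_DT", "source": "",                     "rule": "direct move"},
]

def unmapped_targets(rows):
    """Return target columns whose source field or rule is missing."""
    return [r["target"] for r in rows if not r["source"] or not r["rule"]]

print(unmapped_targets(stm))  # flags DIM_POLICY.EFFECTIVE_DT
```

In practice a check like this runs against the STM spreadsheet or metadata repository before model review, so mapping gaps surface before pipeline development starts.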
Technical Primary Skills
12 years of extensive hands-on experience in data modelling, with an insurance domain background
Experience working in a data warehousing environment
Experience translating business requirements into logical and physical data models
Design and implement effective data models in Snowflake, optimized for analytical and operational reporting needs
Ability to perform data analysis on source data to derive relevant models
Experience developing SCD1/SCD2/SCD3 dimension data models and data warehouses
Strong experience working with Erwin Data Modeler for conceptual, logical, and physical models
Strong in Snowflake data warehousing concepts: staging, curated layers, and marts
Strong SQL skills with the ability to validate transformations and model outputs
Knowledge of handling JSON/XML structures, schema interpretation, and source payload analysis
Strong understanding of data integration issues, validation, and cleansing; familiarity with complex data structures
Experience working with Agile methodology
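The SCD2 experience listed above refers to versioning dimension rows over time: when a tracked attribute changes, the current row is closed out and a new current version is inserted. The sketch below shows that core logic on plain Python records; the column names and sample policy record are hypothetical, and a real implementation would typically be a Snowflake MERGE or a DBT snapshot rather than in-memory code.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, as_of):
    """Slowly Changing Dimension Type 2 sketch: if the tracked attribute
    changed, close the current row for the key and append a new current
    version. dim_rows is mutated in place."""
    current = next((r for r in dim_rows
                    if r["policy_id"] == incoming["policy_id"] and r["is_current"]), None)
    if current and current["status"] == incoming["status"]:
        return dim_rows  # no change: keep the existing current row
    if current:
        current["is_current"] = False  # expire the old version
        current["end_date"] = as_of
    dim_rows.append({"policy_id": incoming["policy_id"],
                     "status": incoming["status"],
                     "start_date": as_of, "end_date": None, "is_current": True})
    return dim_rows

dim = [{"policy_id": "P1", "status": "ACTIVE",
        "start_date": date(2024, 1, 1), "end_date": None, "is_current": True}]
apply_scd2(dim, {"policy_id": "P1", "status": "LAPSED"}, date(2025, 6, 1))
print(len(dim))  # two versions of P1: one expired, one current
```

SCD1 would instead overwrite the attribute in place (no history), and SCD3 would keep the prior value in a dedicated "previous" column.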
Secondary Skills
Insurance domain expertise in lines of business (LOB) such as Workers' Compensation (WC), Commercial Multi-Peril (CMP), Excess & Surplus (E&S), and Excess Casualty (XS)
Soft Skills
Strong analytical thinking and ability to interpret complex business logic
Proactive communication and documentation skills
Ability to work in fast-paced, multi-LOB data programs






