MetaRPO

Sr Data Modeler (Property and Casualty)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr Data Modeler with Property and Casualty (P&C) Insurance domain experience. Contract length and pay rate are unspecified. Key skills include ER modeling, SQL proficiency, and collaboration across Architecture, BA, Engineering, and DBA teams.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date
January 24, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Edison, NJ
🧠 - Skills detailed
#Snowflake #ERWin #Normalization #BI (Business Intelligence) #Indexing #Data Modeling #Leadership #SQL (Structured Query Language) #Datasets #Data Quality #Microsoft Power BI #Business Analysis #Documentation #Scala #DBA (Database Administrator) #Database Design #Data Architecture
Role description
Must have Property and Casualty (P&C) Insurance domain experience.

Role Summary:
We're seeking a hands-on Data Modeler to design and evolve enterprise-grade data models that make diverse source data consumption-ready for reporting, analytics, and advanced insights. You will translate business requirements into conceptual, logical, and physical designs (3NF and dimensional), establish and enforce data standards and naming conventions, and collaborate across Architecture, BA, Engineering, and DBA teams to deliver performant, governed, and scalable models.

Responsibilities:
• Model design & documentation: Create ERDs and data models (conceptual, logical, physical) for relational (3NF) and dimensional (star/snowflake) structures; maintain comprehensive documentation (data dictionaries, entity/attribute catalogs, glossaries). Assist with the development and maturing of data modeling standards, guidelines, processes, and procedures for P&C Insurance.
• Multi-source harmonization: Analyze and reverse-engineer "as-is" structures from varied source systems; design canonical and conformed models to unify and standardize data for enterprise use.
• Requirements translation: Partner with Information Architecture and Business Analysis to convert business concepts into precise technical designs, S2T (source-to-target) mappings, and DDL with clear integrity rules.
• Physical design alignment: Work with Data Architects and DBAs to validate physical design choices (partitioning, indexing, constraints) and ensure alignment to performance, reliability, and governance SLAs.
• Analytics & reporting enablement: Model aggregated and hierarchical structures and define fact/dimension layers, including SCD policies and conformed dimensions, to power BI, self-service, and operational reporting (see the first sketch below).
• Standards & governance: Establish and enforce enterprise data standards (definitions, naming, keys, relationships), embed data quality and lineage considerations, and support stewardship and audit needs.
• Reference/master data: Define approaches to unify identifiers and entities across systems (golden-record rules, survivorship, conformance strategies); see the second sketch below.
• Collaboration & leadership: Guide onshore/offshore contributors on modeling deliverables, review cycles, and documentation; facilitate stakeholder reviews and sign-offs.
• Continuous improvement: Contribute to modeling standards, templates, and checklists; advocate reusable patterns across subject areas and lines of business.

Required Skills & Experience:
• Strong data-modeling expertise using ER diagrams and tools such as Erwin (or ER/Studio or an equivalent tool).
• Proven delivery of conceptual, logical, and physical models at enterprise scale, spanning 3NF and dimensional approaches.
• Ability to gather and translate business requirements into clear technical designs, S2T mappings, and DDL (keys, constraints, indexes).
• SQL proficiency and a deep understanding of database design, normalization, referential integrity, and performance considerations.
• Experience designing for aggregated and hierarchical reporting, SCD policies, and conformed dimensions.
• Effective collaboration with Architecture, BA, DBA, and Engineering teams; excellent written and verbal communication skills.
• High attention to detail, consistency, and data quality across large and complex datasets.
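As a minimal sketch of the fact/dimension and SCD Type 2 work described above (all table, column, and index names here are invented for illustration and are not part of the posting), a policyholder dimension and a premium fact might look like this in ANSI-style SQL; exact types and index support vary by platform:

```sql
-- Hypothetical star-schema fragment for a P&C reporting layer.
-- dim_policyholder is a Type 2 slowly changing dimension: a change
-- to a tracked attribute closes the current row (effective_to,
-- is_current) and inserts a new row with a fresh surrogate key.
CREATE TABLE dim_policyholder (
    policyholder_sk  BIGINT       NOT NULL,   -- surrogate key
    policyholder_nk  VARCHAR(20)  NOT NULL,   -- natural key from the source system
    full_name        VARCHAR(200) NOT NULL,
    risk_state       CHAR(2)      NOT NULL,   -- tracked (Type 2) attribute
    effective_from   DATE         NOT NULL,
    effective_to     DATE         NOT NULL DEFAULT DATE '9999-12-31',
    is_current       BOOLEAN      NOT NULL DEFAULT TRUE,
    CONSTRAINT pk_dim_policyholder PRIMARY KEY (policyholder_sk)
);

CREATE TABLE fact_written_premium (
    policyholder_sk  BIGINT        NOT NULL,  -- FK to the dimension row current at load time
    policy_nk        VARCHAR(20)   NOT NULL,
    transaction_date DATE          NOT NULL,
    written_premium  DECIMAL(18,2) NOT NULL,
    CONSTRAINT fk_fwp_policyholder
        FOREIGN KEY (policyholder_sk)
        REFERENCES dim_policyholder (policyholder_sk)
);

-- Physical-design nod: an index for date-range reporting (platforms
-- such as Snowflake skip explicit indexes and micro-partition instead).
CREATE INDEX ix_fwp_txn_date
    ON fact_written_premium (transaction_date, policyholder_sk);

-- Point-in-time aggregate: premium by the risk state that was in
-- effect when each transaction was recorded.
SELECT d.risk_state,
       SUM(f.written_premium) AS total_written_premium
FROM fact_written_premium f
JOIN dim_policyholder d
  ON d.policyholder_sk = f.policyholder_sk
GROUP BY d.risk_state;
```

Storing the dimension's surrogate key on the fact row is what makes point-in-time reporting possible: each premium transaction stays tied to the policyholder attributes that were current when it was recorded.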
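And a hedged sketch of the golden-record survivorship idea from the reference/master data responsibility: assuming a hypothetical staging table that unions party records from several source systems, a window function picks one surviving row per business key. The precedence order, table, and column names are invented examples, not stated requirements:

```sql
-- Invented staging table: party records landed from multiple
-- source systems; all names and columns are illustrative only.
CREATE TABLE stg_party_all_sources (
    party_id       VARCHAR(20)  NOT NULL,   -- unified business key
    source_system  VARCHAR(30)  NOT NULL,   -- e.g. 'POLICY_ADMIN', 'CLAIMS'
    full_name      VARCHAR(200),
    updated_at     TIMESTAMP    NOT NULL
);

-- Survivorship: keep the most recently updated row per party,
-- breaking ties by a fixed source-system precedence.
WITH ranked AS (
    SELECT
        party_id,
        source_system,
        full_name,
        updated_at,
        ROW_NUMBER() OVER (
            PARTITION BY party_id
            ORDER BY updated_at DESC,
                     CASE source_system
                         WHEN 'POLICY_ADMIN' THEN 1
                         WHEN 'CLAIMS'       THEN 2
                         ELSE 3
                     END
        ) AS survivor_rank
    FROM stg_party_all_sources
)
SELECT party_id, source_system, full_name, updated_at
FROM ranked
WHERE survivor_rank = 1;   -- the "golden" row per party
```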