Senior Data Modeler (Data Vault 2.0 | Erwin | Insurance P&C)

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 16, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#Snowflake #Security #SQL (Structured Query Language) #Normalization #Cloud #Vault #Data Catalog #Erwin #Data Vault #GCP (Google Cloud Platform) #Data Integration #Data Quality #ETL (Extract, Transform, Load) #Data Lineage #Data Management #Data Governance #Data Mart #BI (Business Intelligence) #Metadata #Scala #AWS (Amazon Web Services) #Data Warehouse #Data Modeling #Azure
Role description
Overview
We are seeking a seasoned Data Modeler to design, implement, and govern data models for our Insurance Property & Casualty (P&C) business. The ideal candidate will have hands-on experience with the Data Vault 2.0 methodology, proficiency with Erwin Data Modeler, and strong dimensional modeling skills to support analytics, reporting, and data integration initiatives.

Key Responsibilities
- Lead the end-to-end data modeling lifecycle for P&C insurance domains using Data Vault 2.0 constructs (Hubs, Links, Satellites, and PIT tables).
- Collaborate with enterprise architects, BI teams, data governance, and business stakeholders to translate business requirements into scalable data models.
- Design and optimize dimensional models (fact and dimension tables) to support OLAP/BI and self-service analytics.
- Develop and maintain data warehouses and data marts aligned with Data Vault 2.0 standards and best practices.
- Create and maintain ER and dimensional models in Erwin Data Modeler, including normalization/denormalization strategies, naming conventions, and metadata.
- Implement and enforce data governance, security, and quality controls in modeling artifacts.
- Perform data lineage, impact analysis, and metadata management.
- Mentor junior modelers and contribute to modeling standards, templates, and best practices.
- Collaborate with ETL/ELT developers to ensure correct mappings, historical tracking, and PIT/BI windowing requirements.
- Stay current with industry trends in data vaulting, data warehousing, and insurance data domains (policies, claims, exposures, premiums, payments, reinsurance, etc.).

Required Qualifications
- 5+ years of experience in data modeling within data warehousing environments, including 2+ years with Data Vault 2.0.
- Strong expertise in Data Vault 2.0 architecture (Hubs, Links, Satellites, PIT tables, and satellite history) and business keys.
- Proficiency with Erwin Data Modeler (experience with versions that support cloud/big-data environments is a plus).
- Deep knowledge of dimensional modeling (star/snowflake schemas) and Kimball/Inmon methodologies.
- Domain experience in Property & Casualty insurance (claims, policies, underwriting, premiums, endorsements, settlements, reserves, etc.).
- SQL proficiency, including the ability to review and write complex transformations.
- Experience with ETL/ELT tools and data integration patterns.
- Data governance, data quality, and metadata management experience.
- Excellent communication, stakeholder management, and problem-solving skills.

Nice-to-Haves
- Experience with other modeling tools (e.g., PowerDesigner) or data catalog tools.
- Familiarity with cloud data platforms (AWS, Azure, GCP) and containerized data environments.
- Knowledge of regulatory reporting requirements in insurance (NAIC, Solvency II, and IFRS 17 considerations as applicable).
- Certification in Data Vault 2.0 (e.g., IIL Data Vault 2.0 Certification).

What success looks like
- Scalable Data Vault 2.0 models delivered that enable accurate historical tracking and fast analytics.
- Clean, well-documented Erwin models with clear lineage and metadata.
- Efficient collaboration with IT and business stakeholders, delivering data models on time and to governance and quality standards.
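For candidates less familiar with the Data Vault 2.0 constructs named above (Hubs, Links, Satellites), a minimal sketch may help. All table and column names here are hypothetical examples, not artifacts from this employer; SQLite via Python is used only so the DDL is self-contained and runnable:

```python
import sqlite3

# Illustrative Data Vault 2.0 core structures for a P&C domain (names are assumptions).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hub: one row per unique business key (here, the policy number).
cur.execute("""
    CREATE TABLE hub_policy (
        policy_hk     TEXT PRIMARY KEY,        -- hash of the business key
        policy_number TEXT NOT NULL UNIQUE,    -- business key
        load_dts      TEXT NOT NULL,
        record_source TEXT NOT NULL
    )
""")

# Link: records a relationship between business keys (policy <-> claim).
cur.execute("""
    CREATE TABLE link_policy_claim (
        link_hk       TEXT PRIMARY KEY,        -- hash of both parent keys
        policy_hk     TEXT NOT NULL REFERENCES hub_policy(policy_hk),
        claim_hk      TEXT NOT NULL,
        load_dts      TEXT NOT NULL,
        record_source TEXT NOT NULL
    )
""")

# Satellite: descriptive attributes with full history (insert-only, one row per change).
cur.execute("""
    CREATE TABLE sat_policy_detail (
        policy_hk TEXT NOT NULL REFERENCES hub_policy(policy_hk),
        load_dts  TEXT NOT NULL,
        premium   REAL,
        status    TEXT,
        hash_diff TEXT,                        -- change-detection hash
        PRIMARY KEY (policy_hk, load_dts)
    )
""")

# Two satellite rows for the same hub key = tracked history, never an update in place.
cur.execute("INSERT INTO hub_policy VALUES ('hk1', 'POL-1001', '2025-01-01', 'PAS')")
cur.executemany(
    "INSERT INTO sat_policy_detail VALUES (?, ?, ?, ?, ?)",
    [("hk1", "2025-01-01", 1200.0, "ACTIVE", "d1"),
     ("hk1", "2025-06-01", 1350.0, "ACTIVE", "d2")],
)
history = cur.execute(
    "SELECT load_dts, premium FROM sat_policy_detail "
    "WHERE policy_hk = 'hk1' ORDER BY load_dts"
).fetchall()
print(history)  # both premium versions are retained
```

A PIT table, also mentioned above, would sit alongside these structures and precompute, per hub key and snapshot date, which satellite row was current, so point-in-time joins stay fast.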
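The dimensional-modeling side of the role can be sketched the same way. This is a hypothetical star schema and BI rollup for P&C claims, not this employer's actual model; all names are illustrative:

```python
import sqlite3

# Minimal star schema for claims analytics: one fact table, two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE dim_policy (
        policy_key       INTEGER PRIMARY KEY,  -- surrogate key
        policy_number    TEXT,
        line_of_business TEXT                  -- e.g. 'Auto', 'Homeowners'
    )
""")
cur.execute("""
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,     -- smart key, e.g. 20250110
        calendar_year INTEGER
    )
""")
cur.execute("""
    CREATE TABLE fact_claim (
        claim_key     INTEGER PRIMARY KEY,
        policy_key    INTEGER REFERENCES dim_policy(policy_key),
        loss_date_key INTEGER REFERENCES dim_date(date_key),
        paid_amount   REAL                     -- additive measure
    )
""")

cur.executemany("INSERT INTO dim_policy VALUES (?, ?, ?)",
                [(1, "POL-1001", "Auto"), (2, "POL-1002", "Homeowners")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20250110, 2025), (20250215, 2025)])
cur.executemany("INSERT INTO fact_claim VALUES (?, ?, ?, ?)",
                [(100, 1, 20250110, 500.0),
                 (101, 1, 20250215, 750.0),
                 (102, 2, 20250215, 2000.0)])

# Typical self-service BI rollup: paid losses by line of business and year.
rows = cur.execute("""
    SELECT p.line_of_business, d.calendar_year, SUM(f.paid_amount)
    FROM fact_claim f
    JOIN dim_policy p ON p.policy_key = f.policy_key
    JOIN dim_date d   ON d.date_key   = f.loss_date_key
    GROUP BY p.line_of_business, d.calendar_year
    ORDER BY p.line_of_business
""").fetchall()
print(rows)
```

In practice the fact and dimension tables would be loaded downstream of the Data Vault layer, with the vault holding full history and the star schema serving query-optimized slices.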