

PlanIT Group, LLC
Data Modeler
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Modeler for a long-term contract in a hybrid setting in Richmond, VA. Requires 3-5 years of enterprise data modeling experience, proficiency in SQL, and familiarity with Azure and Power BI. Banking industry experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richmond, VA
-
🧠 - Skills detailed
#Metadata #SQL (Structured Query Language) #Azure #Normalization #Data Vault #SQL Server #BI (Business Intelligence) #Redshift #Snowflake #Cloud #Databases #Delta Lake #Data Modeling #Semantic Models #BigQuery #Microsoft Power BI #Clustering #ERWin #Documentation #Data Governance #PostgreSQL #Synapse #Data Lake #Oracle #Data Dictionary #Security #Vault
Role description
Data Modeler
Hybrid, Richmond, VA
Long-term contract
Design enterprise data models (conceptual, logical, and physical) that enable trusted, reusable, and secure data across the organization.
" Partners effectively with governance, engineering, architecture, analytics, security, and development teams.
" Understands and proliferates modeling standards and patterns aligned to EDM, Data Governance, and industry practices.
" Ensures data models support quality, lineage, privacy, and performance requirements.
" Ensures models are implementable and optimize cost/performance in target platforms.
" Ability to translate complex data concepts into clear business language; produce high-quality documentation and diagrams.
" Keen Business Acumen: Quickly understands domain processes and how data supports decisions.
" Critical/System Thinking: Able to see cross-domain dependencies and design for reuse and extensibility.
" Has a keen attention to detail.
" Intermediate/Advanced Microsoft Office Suite of Tools
" Intermediate/Advanced Data Modeling Skills -- Conceptual, Logical, Physical; Relational (3NF), Dimensional (star/snowflake), Data Vault 2.0, wide tables for lake/ELT, and canonical models for integration.
" Patterns & Techniques: Normalization/denormalization, SCD Type 1/2/3, surrogate keys, conformed dimensions, bridge/junk/helper tables, CDC handling, late/early arriving facts.
" Metadata & Catalog: Business glossary, data dictionary, lineage; catalog tools integration.
" Modeling Standards: Naming conventions, data types, constraints, keys, relationships, and versioning.
" Reference & Master Data: Hierarchies, code sets, golden records; stewardship workflows.
" Modeling Tools: Lucid Chart primary or ER/Studio/ERwin secondary
" Databases (OLTP/OLAP): SQL Server, Oracle, PostgreSQL, Snowflake, Azure Synapse, BigQuery, Redshift.
" Data Lake & Files: Parquet, ORC, Delta Lake; partitioning, Z-ordering, clustering.
" Cloud & Integration: Azure (Synapse, Fabric, SQL DB, Data Factory, Purview)
" BI/Semantic Layer: Power BI, semantic models, measures, and calculation groups (alignment to the logical model).
" Profiling & Quality: DQ; profiling and rule authoring.
" Security/Privacy: Row-/column-level security, dynamic masking, tokenization/encryption, privacy impact assessment alignment.
" SQL (intermediate): Window functions, CTEs, query optimization, index strategies, execution plans.
" Minimum 3-5 years of Data Modeling experience with enterprise scale systems
" Minimum 1-2 years working in MS Azure environments and SQL databases (jSynapse, SQL DB, Data Lake)
" Minimum 1-2 years of working with Power BI semantic modeling and KPI alignment
" Minimum 3-5 years of working with SQL
" Experience in banking, mortgage lending and rental housing preferred





