

Brooksource
Senior Data Modeler
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Modeler based in Kansas City, MO, offering a day rate of $720 USD; the contract length is unknown. Key skills include dimensional modeling, SQL proficiency, and experience with cloud platforms. A Bachelor's degree and 8+ years in data modeling are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
April 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Kansas City, MO
-
🧠 - Skills detailed
#Data Security #Classification #Security #Data Quality #Databricks #BigQuery #Data Strategy #GIT #Neo4J #Redshift #Batch #Data Vault #Data Modeling #Data Governance #Alation #Apache Kafka #Migration #Databases #ML (Machine Learning) #AWS (Amazon Web Services) #Automation #Knowledge Graph #Slowly Changing Dimensions #SQL (Structured Query Language) #SSAS (SQL Server Analysis Services) #Cloud #Leadership #Model Deployment #Data Integration #Data Architecture #AI (Artificial Intelligence) #dbt (data build tool) #Data Lineage #Scala #Snowflake #EDW (Enterprise Data Warehouse) #ETL (Extract, Transform, Load) #Vault #Azure Event Hubs #Data Science #Data Catalog #Synapse #Documentation #Graph Databases #Semantic Models #Strategy #Azure #Collibra #Data Engineering #Kafka (Apache Kafka) #Microsoft Power BI #Version Control #Amazon Neptune #GCP (Google Cloud Platform) #Data Warehouse #DAX #Computer Science #BI (Business Intelligence) #Deployment #Compliance
Role description
The Senior Data Modeler is responsible for designing, developing, and optimizing data models that power enterprise analytics, reporting, AI/ML initiatives, and data products. This role requires deep expertise in dimensional modeling, semantic layer design, and modern data architecture patterns across batch and streaming paradigms.
The ideal candidate combines strong technical data modeling skills with the ability to translate business requirements into scalable, well-governed data structures. You will partner closely with data engineers, analysts, data scientists, and business stakeholders to ensure data models are performant, maintainable, and aligned with enterprise data strategy.
Key Responsibilities
Data Modeling & Architecture
• Design and implement dimensional models (star and snowflake schemas), Data Vault 2.0 structures, and medallion architecture patterns (Bronze/Silver/Gold) to support enterprise analytics and reporting.
• Build and optimize semantic models using Power BI, dbt Semantic Layer, and other technologies to enable consistent, self-service analytics across the organization.
• Lead the migration of legacy SSAS tabular models to modern cloud-based semantic layers while establishing tool-agnostic metric definitions.
• Design event schemas and data models for streaming architectures, including Kafka topics, CDC pipelines, and schema registries (Avro/Protobuf), ensuring proper schema evolution strategies.
• Partner with data engineers to optimize table designs, partitioning strategies, and materialized views for query performance across cloud platforms (Databricks, Snowflake, BigQuery, Azure Synapse).
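To illustrate the star-schema pattern the role centers on, here is a minimal sketch using Python's sqlite3: one fact table joined to two dimension tables through surrogate keys. The table and column names (dim_date, dim_product, fact_sales) are illustrative assumptions, not the employer's actual model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20260409
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,  -- surrogate key
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20260409, '2026-04-09', 2026)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20260409, 1, 3, 29.97)")

# A typical star join: slice the fact table by dimension attributes.
row = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    WHERE d.year = 2026
    GROUP BY p.category
""").fetchone()
```

The same join shape extends to any number of conformed dimensions, which is what makes the pattern friendly to BI tools and semantic layers.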
Data Governance & Quality
• Design data models with embedded governance: data security at all levels, data lineage documentation, data quality rules, sensitivity classifications (PII/PHI), and business glossary alignment.
• Document and maintain data models, mappings, and business rules within enterprise data catalogs (Alation or similar), ensuring full traceability and impact analysis capabilities.
• Implement slowly changing dimension (SCD) patterns and historical tracking strategies appropriate to business requirements.
• Establish and enforce data modeling standards, naming conventions, and best practices across the data organization.
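The slowly changing dimension (Type 2) pattern mentioned above preserves history by expiring the current row and appending a new one rather than overwriting attributes. This plain-Python sketch shows the mechanics on in-memory records; the field names (key, attrs, start_date, end_date) are hypothetical, and in practice this logic lives in SQL or dbt.

```python
from datetime import date

def apply_scd2(history, business_key, new_attrs, effective):
    """Close the open row for business_key if its attributes changed,
    then append a new current row. Mutates and returns history."""
    for row in history:
        if row["key"] == business_key and row["end_date"] is None:
            if row["attrs"] == new_attrs:
                return history        # no change: nothing to version
            row["end_date"] = effective  # expire the old version
    history.append({
        "key": business_key,
        "attrs": new_attrs,
        "start_date": effective,
        "end_date": None,             # open-ended = current version
    })
    return history

history = [{"key": "C1", "attrs": {"city": "Kansas City"},
            "start_date": date(2024, 1, 1), "end_date": None}]
apply_scd2(history, "C1", {"city": "Overland Park"}, date(2026, 4, 9))
current = [r for r in history if r["end_date"] is None]
```

Because every version carries an effective date range, point-in-time queries stay correct, which is the business requirement SCD Type 2 exists to serve.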
AI/ML & Analytics Enablement
• Create data products and semantic layers that enable business users to perform self-service analytics while maintaining governance and data consistency.
• Partner with data scientists to design feature stores and ML-ready data models, understanding feature engineering requirements and training vs. serving data patterns.
Leadership & Influence
• Serve as a subject matter expert and thought leader for data modeling and semantic layer design, mentoring team members and influencing architecture decisions.
• Contribute to enterprise data strategy by shaping how data products, metrics, and insights are standardized and scaled across the organization.
Collaboration & Operations
• Collaborate with stakeholders across business units to gather requirements, translate business needs into data model designs, and recommend solutions for new and existing data products.
• Mentor junior data modelers and data engineers on modeling methodologies, best practices, and modern data architecture patterns.
• Implement data models as code using dbt and version control (Git), establishing CI/CD pipelines for model testing and deployment.
• Ensure operational reliability of data models, address data quality issues promptly, and work with technical and non-technical users to troubleshoot and resolve data discrepancies.
• Research and recommend new tools, methods, or technologies to continuously improve data modeling practices and data platform capabilities.
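The "models as code" bullet above implies automated testing of model outputs. In dbt these checks are declared in YAML, but what they assert boils down to simple predicates like the ones below (hypothetical table, plain-Python sketch).

```python
def check_unique(rows, column):
    """Equivalent of a dbt 'unique' test: no duplicate values in column."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_not_null(rows, column):
    """Equivalent of a dbt 'not_null' test: no missing values in column."""
    return all(r[column] is not None for r in rows)

dim_product = [
    {"product_key": 1, "product_name": "Widget"},
    {"product_key": 2, "product_name": "Gadget"},
]
unique_ok = check_unique(dim_product, "product_key")
not_null_ok = check_not_null(dim_product, "product_name")
```

Running such checks in a CI/CD pipeline on every pull request is what keeps model changes safe to deploy.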
Required Qualifications
Education & Experience
• Bachelor's degree in Computer Science, Information Systems, Data Science, Engineering, or related field.
• 8+ years of experience in data modeling with demonstrated expertise in enterprise data warehouse and analytics environments.
Technical Skills
• Deep expertise in dimensional modeling methodologies (Kimball star/snowflake schemas, conformed dimensions, slowly changing dimensions) and familiarity with Data Vault 2.0 or normalized modeling approaches.
• Proactive in championing and establishing AI-agent workflows across data engineering, reporting, and AI/ML initiatives to drive automation and intelligent data operations.
• Advanced proficiency in SQL (including modern dialects for Snowflake, BigQuery, Databricks SQL), DAX, and Power Query (M).
• Strong experience with semantic layer platforms, including Power BI Desktop/Service, and familiarity with dbt Semantic Layer, LookML, or similar.
• Hands-on experience with cloud data platforms: Databricks, Snowflake, Google BigQuery, Azure Synapse, or AWS Redshift.
• Solid understanding of ELT/ETL patterns, change data capture (CDC), and data integration best practices.
• Experience with data catalogs and lineage tools (Alation, Collibra, Atlan, or similar) for documentation and governance.
• Proficiency with Git-based version control and familiarity with CI/CD concepts for data model deployment.
Professional Skills
• Strong analytical and problem-solving abilities with excellent attention to detail.
• Excellent verbal and written communication skills with the ability to explain complex data concepts to both technical and non-technical audiences.
• Proven ability to collaborate across cross-functional teams and facilitate requirements-gathering sessions.
• Experience mentoring junior team members and establishing team standards.
Preferred Qualifications
• Experience with event-driven architecture and streaming data modeling (Apache Kafka, Confluent, Azure Event Hubs) including schema registries and schema evolution patterns.
• Experience with dbt (data build tool) for transformation modeling and testing.
• Familiarity with typical corporate business terms and data used in Finance, Human Resources, Marketing, etc.
• Familiarity with feature engineering concepts, feature stores, and designing data models for machine learning workloads.
• Understanding of data mesh principles, data products, and domain-oriented data ownership.
• Experience with graph databases (Neo4j, Amazon Neptune) or knowledge graph modeling.
• Knowledge of OpenLineage or similar open standards for data lineage.
• Background in AEC (Architecture, Engineering, Construction) or related industries.
• Relevant certifications in cloud platforms (Azure, GCP, AWS) or data modeling.
Work Environment
• Compliance with company and site safety policies is required.
• Adherence to QA/QC standards and data governance policies.
• Additional duties may be assigned as business needs evolve.
