PSI (Proteam Solutions)
Data Modeler
⭐ - Featured Role | Apply directly with Data Freelance Hub
This listing is for a Data Modeler; the contract length and pay rate are not specified. The position requires 10+ years of experience in data modeling and architecture, expertise in Google BigQuery and Databricks, and proficiency in SQL and data modeling tools.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
776
🗓️ - Date
January 16, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Columbus, Ohio Metropolitan Area
🧠 - Skills detailed
#Data Governance #Compliance #Normalization #Databricks #Data Pipeline #Security #Data Architecture #Data Security #MDM (Master Data Management) #Data Engineering #Data Warehouse #ETL (Extract, Transform, Load) #Scala #Snowflake #BigQuery #Data Ingestion #SQL (Structured Query Language) #Data Integrity #ERwin #EDW (Enterprise Data Warehouse) #Physical Data Model #Data Storage #Storage #Data Lake #Data Modeling #Cloud #Data Management #Data Lakehouse #Metadata
Role description
Role Overview
We are seeking a highly experienced Data Modeler to design and implement enterprise-grade data solutions. This role requires deep expertise in data modeling principles, data architecture frameworks, and modern cloud-based platforms. The candidate will play a critical role in shaping our data ecosystem, ensuring scalability, performance, and alignment with business needs.
Key Responsibilities
• Data Modeling:
• Design conceptual, logical, and physical data models for structured and semi-structured data.
• Define entity relationships, normalization/denormalization strategies, and dimensional modeling for analytics.
• Develop star and snowflake schemas for data warehouses and analytical workloads (see the star-schema sketch after this list).
• Ensure data integrity, consistency, and compliance with governance standards.
• Data Architecture:
• Architect end-to-end data solutions leveraging modern frameworks and best practices.
• Implement Medallion architecture (Bronze, Silver, Gold layers) for scalable data pipelines (a Databricks sketch follows this list).
• Optimize data storage and retrieval strategies for performance and cost efficiency.
• Collaborate with engineering teams to design data ingestion, transformation, and integration workflows.
• Technology Stack:
• Hands-on experience with Google BigQuery for large-scale analytics.
• Expertise in Databricks for data engineering and advanced analytics.
• Familiarity with cloud-native architectures, distributed systems, and data lakehouse concepts.
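As context for the dimensional-modeling responsibilities above, here is a minimal star-schema sketch in standard SQL. All table and column names (fact_sales, dim_customer, dim_date) are illustrative assumptions, not taken from this posting.

  -- Dimension tables hold descriptive attributes keyed by surrogate keys.
  CREATE TABLE dim_customer (
    customer_key  INT PRIMARY KEY,
    customer_name VARCHAR(100),
    region        VARCHAR(50)
  );

  CREATE TABLE dim_date (
    date_key       INT PRIMARY KEY,
    full_date      DATE,
    fiscal_quarter VARCHAR(6)
  );

  -- The fact table records measures at a declared grain (one row per sale)
  -- and references each dimension through its surrogate key.
  CREATE TABLE fact_sales (
    sale_id      BIGINT PRIMARY KEY,
    customer_key INT REFERENCES dim_customer (customer_key),
    date_key     INT REFERENCES dim_date (date_key),
    quantity     INT,
    sale_amount  NUMERIC(12, 2)
  );

A snowflake schema would further normalize the dimensions, for example splitting region out of dim_customer into its own table at the cost of extra joins.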
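And a minimal Medallion (Bronze/Silver/Gold) sketch, assuming Databricks SQL with Delta Lake tables; the schema names, the landing path, and the orders columns are all hypothetical.

  -- Bronze: raw ingested records kept as-is so loads can be replayed.
  -- (read_files is Databricks' file-reading table function; the path is hypothetical.)
  CREATE TABLE IF NOT EXISTS bronze.raw_orders AS
  SELECT * FROM read_files('/mnt/landing/orders/', format => 'json');

  -- Silver: cleaned, typed, de-duplicated records.
  CREATE TABLE IF NOT EXISTS silver.orders AS
  SELECT DISTINCT
    CAST(order_id AS BIGINT)        AS order_id,
    CAST(order_ts AS TIMESTAMP)     AS order_ts,
    CAST(amount   AS DECIMAL(12,2)) AS amount
  FROM bronze.raw_orders
  WHERE order_id IS NOT NULL;

  -- Gold: business-level aggregates ready for analytics and BI.
  CREATE TABLE IF NOT EXISTS gold.daily_revenue AS
  SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
  FROM silver.orders
  GROUP BY DATE(order_ts);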
Required Qualifications
• 10+ years of experience in data modeling and data architecture.
• Proven track record in designing enterprise data warehouses and analytical platforms.
• Strong understanding of Medallion architecture and modern data lakehouse design.
• Proficiency in SQL, data modeling tools (e.g., ERwin, PowerDesigner), and ETL/ELT frameworks.
• Hands-on experience with Google BigQuery and Databricks (see the partitioning sketch below this list).
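For the BigQuery requirement above, a short sketch of the kind of storage-aware DDL involved: a table partitioned by day and clustered on a common filter column. The dataset and columns (analytics.events, event_ts, user_id) are assumptions for illustration.

  -- Day partitions plus clustering let date-bounded queries scan only the
  -- partitions (and blocks) they actually need.
  CREATE TABLE analytics.events (
    event_ts   TIMESTAMP,
    user_id    STRING,
    event_name STRING,
    payload    JSON
  )
  PARTITION BY DATE(event_ts)
  CLUSTER BY user_id;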
Preferred Skills
• Knowledge of data governance, metadata management, and master data management.
• Experience with performance tuning and query optimization in large-scale environments (illustrated after this list).
• Familiarity with data security and compliance standards.
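On the query-optimization point, a small BigQuery illustration of partition pruning, reusing the hypothetical analytics.events table sketched above.

  -- Anti-pattern: reformatting the partition column with a function can defeat
  -- pruning and force a scan of every partition:
  --   WHERE FORMAT_TIMESTAMP('%Y-%m-%d', event_ts) = '2026-01-16'

  -- Better: filter on the partitioning expression itself so the engine prunes
  -- to a single day's partition before scanning.
  SELECT event_name, COUNT(*) AS n
  FROM analytics.events
  WHERE DATE(event_ts) = '2026-01-16'
  GROUP BY event_name;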