

HMG AMERICA LLC
Data Modeler
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Lake/Sr. Data Modeler in the US Healthcare Payer domain, hybrid in Newtown Square, PA, with contract length and pay rate unspecified. Requires 10+ years of experience, Azure expertise, and strong SQL skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Pennsylvania, United States
-
🧠 - Skills detailed
#BI (Business Intelligence) #Delta Lake #Data Lake #Compliance #ML (Machine Learning) #Spark SQL #Spark (Apache Spark) #SQL (Structured Query Language) #ERWin #Data Profiling #Azure Databricks #Synapse #ADLS (Azure Data Lake Storage) #Vault #Azure #Documentation #Data Modeling #ADF (Azure Data Factory) #Clustering #Microsoft Power BI #Data Science #Data Engineering #Databricks
Role description
Hi,
Role: Data Lake / SR Data Modeler Architect (US Healthcare Payer Domain)
Location: Newtown Square, PA (hybrid; 3 days onsite required)
Work mode: EST-based resources preferred
Contract
Role Summary
Seeking a Data Lake/Lakehouse Data Modeler with deep hands-on experience building governed, secure, and high-performance data models on Azure for Healthcare use cases. The role will design logical and physical schemas across landing, curated, and serving layers to support Payer-domain analytics and governance reporting.
Key Responsibilities:
• Design logical and physical models across raw, curated, and consumption layers, optimized for lakehouse patterns.
• Define canonical models and source-to-target mappings for member, provider, claims, prior authorization, etc.
• Define retention, archival, encryption (CMK), and access controls using Azure Key Vault and Azure AD.
• Work with data engineers and platform teams to implement models in Databricks and Synapse with CI/CD and tests.
• Perform data profiling, validation, and iterative tuning to meet performance targets and SLAs for BI and ML consumers.
Required Qualifications:
• Proven experience modeling Lakehouse or data lake solutions for US Healthcare Payers.
• Deep knowledge of the US Payer domain, including modeling of members, providers, and all claim types.
• Hands-on experience with ADLS Gen2, Azure Databricks (Delta Lake).
• Practical experience with CDC, streaming, low-latency analytics, and schema evolution strategies.
• Familiarity with Unity Catalog, Azure AD, Key Vault, network isolation, and regulatory compliance controls.
Technical Skills:
• Delta Lake, Parquet/ORC, schema evolution, partitioning, and Z-order/clustering strategies.
• Spark SQL and Databricks notebooks (or similar modeling frameworks), plus strong SQL proficiency.
• Ingestion and orchestration with ADF, Tidal, and Databricks Workflows.
• BI and analytics integration using Power BI, Synapse SQL, and Databricks SQL for served models.
• Strong stakeholder engagement across compliance, risk, finance, engineering, and data science teams.
• Clear documentation, model governance, and ability to present designs for audits and architecture reviews.
Experience & Education:
• 10+ years of experience in data modeling or data engineering; US Healthcare Payer domain experience is a must.
• 5 years of experience with modeling tools such as Erwin or Embarcadero ER/Studio.
• 5 years of experience modeling various data domains within the Payer domain.
• Azure or Databricks certifications are a plus.
• Deliverables: Model artifacts and data dictionaries, source-to-target mappings, lineage for audits, and measurable performance and quality improvements.





