

Holistic Partners, Inc
Data Warehouse Specialist
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Specialist for 12+ months in Minnetonka, MN, and Madison, WI (Hybrid). Requires strong DW/Data Engineering experience in Healthcare, expertise in Snowflake, and proficiency in modern data tools like Kafka and Azure Data Factory.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
January 13, 2026
Duration
More than 6 months
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Minnetonka, MN
-
Skills detailed
#Azure #Apache Airflow #Kafka (Apache Kafka) #Informatica #ADF (Azure Data Factory) #Data Ingestion #Cloud #Snowflake #Azure Data Factory #dbt (data build tool) #Metadata #Azure ADLS (Azure Data Lake Storage) #Oracle GoldenGate #Airflow #Data Warehouse #Spark (Apache Spark) #PySpark #Apache Iceberg #Data Quality #Storage #Data Accuracy #Oracle #Data Lake #Data Analysis #ETL (Extract, Transform, Load) #Data Engineering #ADLS (Azure Data Lake Storage)
Role description
Job Title: DW Engineer
Location: Minnetonka, MN, Madison, WI (Hybrid)
Duration: 12+ Months
Interview: Video
Tax Terms: C2C (Own Corp & Sub Vendors Allowed)
Position Summary:
The client is seeking an experienced Data Warehouse (DW) Engineer with a strong Data Analyst mindset and a modern data engineering background to support their next-generation ingestion and analytics framework. This role emphasizes quality engineering, analytical thinking, and modern tool expertise, not generalist "can-do-everything" profiles.
This is a high-impact role contributing to Medica's modern data platform transformation using Snowflake and cloud-native technologies.
Key Responsibilities:
Design, develop, and optimize modern data ingestion and transformation pipelines
Build and maintain high-quality analytical data models in Snowflake
Implement data quality, validation, and governance frameworks
Support metadata and pipeline management
Work closely with data analysts, product teams, and business stakeholders
Ensure data accuracy, reliability, and performance across the platform
Core Environment:
Snowflake
Informatica
Modern Technology Stack (Must Have ~70% or More):
Kafka
Oracle GoldenGate
Azure Data Lake Storage (ADLS Gen2)
Apache Iceberg
DBT (Snowflake)
Informatica IDMC
PySpark (Snowflake / Microsoft Fabric)
Microsoft Fabric
Azure Data Factory (ADF)
Apache Airflow
Metadata & Pipeline Management frameworks
Required Qualifications:
Strong DW / Data Engineering experience in Healthcare environments
Solid Data Analyst background
Strong focus on data quality, governance, and analytical modeling
Proven experience working with modern cloud data platforms
Excellent problem-solving and critical-thinking skills
Preferred Skills:
Data Quality frameworks
Metadata-driven pipelines
Experience modernizing legacy ingestion frameworks
Snowflake performance optimization






