Hollstadt Consulting

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer, hybrid in Oak Park Heights, MN, lasting 3-6 months. Pay ranges from $67.61-$70.55 per hour. Key skills include Snowflake, dbt, Fivetran, and Azure. Requires 8-10 years of relevant experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
April 3, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Stillwater, MN
-
🧠 - Skills detailed
#Data Architecture #ETL (Extract, Transform, Load) #Data Pipeline #Azure DevOps #DataOps #Automation #Classification #Code Reviews #Snowflake #Databases #Data Catalog #Azure #Leadership #dbt (data build tool) #Vault #Data Lineage #Scala #Data Vault #Clustering #Data Engineering #AI (Artificial Intelligence) #Data Privacy #Fivetran #ML (Machine Learning) #Security #Data Framework #Datasets #Metadata #Cloud #DevOps #Observability #Storage
Role description
Role: Senior Data Engineer
Location: Hybrid in Oak Park Heights, MN (3 days onsite per week)
Duration: 3-6 months with potential to convert
Rate: $67.61-$70.55 per hour, dependent on skills and qualifications
Tech Stack: Snowflake, dbt, Fivetran, Azure (ADO, Blob Storage, Functions, etc.)

Role Overview
We are seeking a seasoned Senior Data Engineer to lead technical execution and provide architectural guidance across our onshore and offshore engineering teams. In this role, you will be the primary technical point of contact, ensuring that complex data requirements are translated into scalable, resilient, and highly optimized solutions. You will not only build but also influence the standards for a modern data stack centered on Snowflake and Azure, driving "Governance as Code" and operational excellence.

Key Responsibilities

1. Technical Leadership & Cross-Shore Guidance
Engineering Anchor: Act as the primary technical lead for distributed teams, ensuring clarity in requirements and maintaining high standards of execution across time zones.
Architectural Blueprinting: Engage with Product Owners and Solution Architects to design optimal data product pipelines that serve as the foundational reference for the broader engineering team.
Resilience Engineering: Design systems for high availability and fault tolerance, ensuring the data platform can recover gracefully from upstream failures.

2. Data Platform & Pipeline Engineering
Modern Data Stack Mastery: Engineer and optimize full-lifecycle data pipelines using Fivetran, Snowflake, and dbt, focusing on large-scale, complex datasets.
Metadata-Driven Automation: Design and implement config-driven or metadata-driven pipelines to increase development velocity and reduce manual overhead.
Layered Frameworks: Apply advanced modeling techniques (Data Vault, Dimensional/Star Schema) to create high-performance, curated, reusable core datasets and purpose-built datasets optimized for analytics and AI.

3. Performance & Cost Optimization
Snowflake Expert: Apply advanced proficiency in Snowflake performance tuning (clustering, warehouse profiling, query optimization) to minimize both latency and consumption costs.
End-to-End Efficiency: Monitor and tune the entire flow from ingestion to transformation to ensure the stack remains performant as data volumes scale.

4. Governance, Security & DataOps
Governance as Code: Implement and validate automated data lineage, quality checks, and data classification within the CI/CD workflow.
Observability & Health: Drive platform reliability by implementing end-to-end observability; proactively monitor data health and enforce rigorous quality gates using dbt.
Azure Integration: Manage and optimize data flows within the Azure ecosystem, leveraging Azure DevOps (ADO), Blob Storage, and Azure Functions.

Required Qualifications
Experience: 8-10 years of experience building and optimizing large-scale, complex data architectures and pipelines.
Core Stack: Expert-level command of Snowflake, dbt, and Fivetran.
Cloud Infrastructure: Strong proficiency in Azure services (Storage, Compute, and DevOps/CI/CD).
Modeling: Proven ability to engineer layered data frameworks using various modeling methodologies (e.g., Data Vault 2.0).
Leadership: Experience guiding offshore teams and conducting technical code reviews to ensure consistency and adherence to patterns.

Preferred "Good to Have" Skills
AI/ML Enablement: Experience building data foundations that enable Machine Learning and Generative AI use cases (e.g., vector databases, feature stores).
Advanced Governance: Experience with automated data privacy/masking and advanced metadata cataloging.