

Aptonet Inc
Senior Snowflake Data Warehouse Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Snowflake Data Warehouse Engineer on an initial 6-month contract with potential extensions, offering competitive pay. Key skills include Snowflake, advanced SQL, ELT frameworks (dbt), and familiarity with financial data concepts. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Wisconsin, United States
-
🧠 - Skills detailed
#Data Quality #Datasets #Snowflake #Monitoring #dbt (data build tool) #BI (Business Intelligence) #Cloud #Storage #Data Modeling #SQL (Structured Query Language) #Clustering #Scala #Data Warehouse #GitLab #Version Control #Databases #Data Architecture #Azure Cloud #Data Science #Azure #Data Processing #Strategy #Data Engineering #Deployment #ETL (Extract, Transform, Load) #Snowpipe #Slowly Changing Dimensions #Security #Risk Analysis
Role description
Senior Snowflake Data Warehouse Engineer
We are seeking a Senior Snowflake Data Warehouse Engineer with strong data engineering expertise to design and deliver scalable data warehouse solutions supporting advanced analytics and financial reporting workloads. This role will focus on building high-performance data platforms that enable portfolio analytics, risk analysis, and enterprise reporting.
The ideal candidate has experience designing enterprise-grade analytical platforms, implementing robust change data capture (CDC) frameworks, and supporting performance-sensitive analytical workloads in production environments. This position will collaborate closely with data architects, data scientists, and business intelligence teams to develop reliable, governed, and performant data solutions.
Key Responsibilities
• Design and implement Snowflake schemas and object lifecycle strategies optimized for analytical workloads.
• Develop scalable dimensional and time-series data models supporting portfolio hierarchies, positions, security master integration, exposures, and risk metrics.
• Build and maintain ELT pipelines using dbt and native Snowflake capabilities, including Streams, Tasks, and Snowpipe, to support daily and intraday data processing.
• Implement change data capture (CDC) and incremental data processing frameworks for position, pricing, risk, and reference data.
• Develop and maintain high-performance tables, views, and materialized views aligned with analytical workloads.
• Optimize query performance, including clustering strategy design, micro-partition optimization, warehouse sizing, caching strategies, and workload management.
• Establish data quality frameworks, including reconciliation checks, completeness validation, monitoring, and alerting for critical datasets.
• Automate Snowflake deployments and SQL transformations using GitLab CI/CD pipelines and version control best practices.
• Document data models, lineage, architecture decisions, governance policies, and operational procedures.
• Troubleshoot production issues, perform root cause analysis, and resolve problems impacting reporting timelines.
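For context on the native Snowflake features named in the responsibilities above, the Streams-plus-Tasks CDC pattern can be sketched roughly as follows. This is an illustrative sketch only; every object, warehouse, and column name is invented for the example, not taken from this posting.

```sql
-- A stream tracks row-level changes on a raw landing table.
CREATE OR REPLACE STREAM positions_stream ON TABLE raw.positions;

-- A task merges new and changed rows on a schedule, and is skipped
-- automatically whenever the stream has no pending data.
CREATE OR REPLACE TASK merge_positions
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('positions_stream')
AS
  MERGE INTO analytics.positions AS tgt
  USING (
    -- Keep only the insert side of each change; updates in a stream
    -- surface as delete+insert pairs. Delete handling is omitted
    -- here to keep the sketch short.
    SELECT * FROM positions_stream WHERE METADATA$ACTION = 'INSERT'
  ) AS src
    ON tgt.position_id = src.position_id
  WHEN MATCHED THEN UPDATE SET
    tgt.quantity   = src.quantity,
    tgt.updated_at = src.updated_at
  WHEN NOT MATCHED THEN INSERT (position_id, quantity, updated_at)
    VALUES (src.position_id, src.quantity, src.updated_at);

ALTER TASK merge_positions RESUME;
```

Consuming the stream inside a DML statement advances its offset, so each captured change is processed once rather than reprocessed on every run.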
Required Qualifications
• 7+ years of experience in data engineering or data warehousing roles.
• 5+ years of hands-on Snowflake experience in production environments, including:
  • Managing databases, schemas, roles, and access controls
  • Designing and implementing Streams and Tasks for CDC and scheduled data processing
  • Building and optimizing performance-driven data models and materialized views
• Advanced SQL expertise, including window functions, CTEs, analytic queries, set-based transformations, and query optimization.
• 4+ years of experience with ELT frameworks, preferably dbt.
• Experience integrating Snowflake with Azure cloud storage and upstream enterprise systems.
• Proven experience implementing CI/CD pipelines (GitLab) for SQL transformations and Snowflake object deployments.
• Strong knowledge of analytical data modeling, including star schemas, fact tables, slowly changing dimensions, aggregates, and large-scale time-series datasets.
• Experience supporting high-volume, performance-sensitive analytical datasets.
• Familiarity with financial data concepts, such as portfolio accounting, security master data, market data feeds, risk metrics, factor exposure, stress testing, or performance attribution.
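The dbt and incremental-processing requirements above can be pictured with a minimal dbt incremental model; the model, source, and column names below are hypothetical, chosen only to illustrate the pattern.

```sql
-- models/marts/fct_daily_positions.sql
-- Hypothetical dbt incremental model; all names are illustrative only.
{{ config(materialized='incremental', unique_key='position_day_id') }}

SELECT
    position_id || '-' || as_of_date AS position_day_id,  -- simple surrogate key
    position_id,
    as_of_date,
    market_value,
    loaded_at
FROM {{ source('raw', 'positions') }}
{% if is_incremental() %}
  -- On incremental runs, pick up only rows newer than the latest already loaded.
  WHERE loaded_at > (SELECT MAX(loaded_at) FROM {{ this }})
{% endif %}
```

On Snowflake, dbt compiles an incremental model with a `unique_key` to a MERGE statement, which pairs naturally with the Streams/Tasks ingestion described in the responsibilities.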
Engagement Details
• Contract duration: Initial 6-month engagement with potential extensions.






