

Curate Partners
Senior Data Engineer - Snowflake
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer specializing in Snowflake; contract length and pay rate are not specified. Required skills include strong Snowflake SQL, Python, AWS experience, and familiarity with web analytics data in financial services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Agile #Scripting #Azure Virtual Desktop #Documentation #AWS (Amazon Web Services) #Datasets #Azure #Data Architecture #Migration #Logging #Data Engineering #Cloud #Data Modeling #SQL (Structured Query Language) #Python #Snowflake #Monitoring #Scala
Role description
Senior Data Engineer (Snowflake)
• Must be strong with Snowflake, web analytics data, and transactional financial data
• Seeking a Senior Data Engineer who is very strong with Snowflake to join a high-impact delivery pod supporting a major client in financial services.
The engagement centers on a critical data source migration — transitioning from a legacy internal data platform to a new vendor-sourced digital/web data feed. This role emphasizes scalable engineering, data modeling best practices, and operational reliability within the Snowflake ecosystem.
Key Responsibilities
Pipeline & Data Model Development
Design and implement Snowflake-based ingestion and transformation pipelines to support the data source migration
Build and optimize fact and dimension models for analytical consumption in Snowflake's data warehousing environment
Implement sessionization and event-based modeling patterns
Support the gap analysis between legacy and new-source datasets to ensure no data elements are lost during the transition
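As a rough illustration of the sessionization pattern mentioned above, here is a minimal Python sketch that groups timestamped events into sessions using an inactivity gap. The 30-minute threshold and the event shape are assumptions for illustration, not details from the posting:

```python
from datetime import datetime, timedelta

# Inactivity gap that closes a session. 30 minutes is a common
# web-analytics default; the real threshold would come from the
# client's data contract.
SESSION_GAP = timedelta(minutes=30)

def sessionize(events):
    """Assign a session id to each (user_id, timestamp) event.

    `events` must be sorted by (user_id, timestamp). Returns a list of
    (user_id, timestamp, session_id) tuples, with session ids counted
    per user starting at 1.
    """
    out = []
    last_user, last_ts, session_id = None, None, 0
    for user_id, ts in events:
        if user_id != last_user:
            session_id = 1  # new user: restart the session counter
        elif ts - last_ts > SESSION_GAP:
            session_id += 1  # same user, long gap: start a new session
        out.append((user_id, ts, session_id))
        last_user, last_ts = user_id, ts
    return out
```

In Snowflake itself the same pattern is usually expressed with a LAG() window function plus a conditional running SUM() over the session-break flags, but the grouping logic is identical.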
Engineering Excellence
Apply best practices for testing, monitoring, logging, and cost optimization
Partner with analysts on data contracts and source-to-target mappings
Contribute reusable patterns and technical documentation
Leverage Python for orchestration, transformation, and scripting tasks
Work within AWS cloud-native architectures and CI/CD practices
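To make the data-contract bullet concrete, a hypothetical source-to-target mapping check in Python. The column names and mapping are invented for illustration; the real contract would be agreed with the analysts:

```python
def validate_mapping(source_columns, mapping, target_columns):
    """Check a source-to-target column mapping for gaps.

    Returns (unmapped_source, missing_targets): source columns the
    mapping never references, and mapped targets absent from the
    target schema. Both empty means the contract is fully covered.
    """
    unmapped_source = sorted(set(source_columns) - set(mapping))
    missing_targets = sorted(set(mapping.values()) - set(target_columns))
    return unmapped_source, missing_targets

# Hypothetical legacy schema, contract, and vendor-feed target schema.
legacy = ["visitor_id", "page_url", "event_ts", "revenue"]
contract = {"visitor_id": "user_key", "page_url": "page_key",
            "event_ts": "event_time"}
target = ["user_key", "page_key", "event_time"]
```

A check like this supports the gap analysis between legacy and new-source datasets: any column it flags is a data element that would be lost in the transition.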
Collaboration & Delivery
Collaborate closely with pod members, client stakeholders, and subject matter experts
Participate in sprint planning, retrospectives, and daily stand-ups — integrating into the client's existing Agile cadence
Support a phased delivery approach across multiple workstreams
Required Experience & Skills
6–10+ years of data engineering experience
Strong Snowflake SQL and analytical data modeling expertise — specifically building facts and dimensions in a data warehousing context
Hands-on experience with Python-based orchestration or transformation frameworks
AWS / Cloud — foundational cloud experience is essential
Very strong SQL skills — this is a baseline expectation
Familiarity with cloud-native data architectures and CI/CD practices
Strong problem-solving and collaboration skills
Experience with digital interaction / web analytics data (e.g., Adobe Analytics or similar platforms) — highly preferred, as the engagement involves working with vendor-sourced digital data
Experience in large-scale or regulated enterprise environments, particularly financial services
Familiarity with Azure Virtual Desktop environments






