

Snowflake Engineer (Onsite Role)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Engineer (Onsite); the contract length and pay rate are not specified. It requires 18+ years of experience, expertise in Snowflake architecture and SQL optimization, and familiarity with data governance in financial services.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
September 27, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Austin, TX
🧠 - Skills detailed
#Airflow #GCP (Google Cloud Platform) #Data Manipulation #Clustering #Snowflake #Migration #Data Integrity #Cloud #Storage #Scala #SQL (Structured Query Language) #Data Pipeline #Business Analysis #Data Governance #Data Lineage #Data Catalog #Azure #Python #Data Quality #Data Storage #Security #Automation #Data Engineering #XML (eXtensible Markup Language) #SnowPipe #.Net #Compliance #BigQuery #JSON (JavaScript Object Notation) #Data Modeling #Metadata #dbt (data build tool)
Role description
We are seeking an experienced Enterprise Snowflake Engineer to lead the design, development, and optimization of scalable data solutions using the Snowflake Data Cloud. This role involves working across teams to build robust data pipelines, ensure data integrity and security, and support enterprise-wide analytics and reporting initiatives.
Required Skills:
• Bachelor’s degree or foreign equivalent; work experience will be considered in lieu of a degree.
• 18+ years of overall experience, including 10+ years in data engineering or cloud data solutions (e.g., Azure, GCP).
• Role requires deep expertise in Snowflake architecture, SQL optimization, and data modeling, as well as hands-on experience with GCP BigQuery schemas, UDFs, and partitioning strategies.
• The engineer will be responsible for translating query logic, optimizing performance, and ensuring data integrity throughout the BigQuery-to-Snowflake migration.
• Familiarity with orchestration tools (e.g., Airflow, dbt), secure data sharing, and compliance with financial data governance standards is essential.
• Design scalable data models optimized for Snowflake’s cloud-native architecture, including use of virtual warehouses, clustering keys, and materialized views.
• Collaborate with data engineers and architects to implement models that support real-time analytics, fraud detection, and risk management.
• Ensure data quality and consistency across diverse financial systems such as trading platforms, customer onboarding, and compliance tools.
• Integrate structured and semi-structured data (e.g., JSON, XML) using Snowflake’s native capabilities to support complex financial reporting.
• Document data lineage and metadata to support auditability and transparency for internal and external stakeholders.
• Optimize data storage and query performance using Snowflake-specific features like automatic clustering and query profiling.
• Support data governance initiatives by aligning models with enterprise data catalogs, stewardship policies, and access controls.
• Collaborate with business analysts and compliance teams to translate financial reporting needs into robust data structures.
• Continuously refine models based on evolving financial products, market conditions, and regulatory changes.
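The Snowflake-specific modeling techniques named above — clustering keys, materialized views, and native handling of semi-structured JSON — can be sketched in Snowflake SQL. All table and column names here are hypothetical, chosen to match the financial-services context of the role:

```sql
-- Hypothetical trades table, clustered on the columns most
-- reporting queries filter by (trade date, instrument).
CREATE TABLE trades (
    trade_id      NUMBER,
    trade_date    DATE,
    instrument_id VARCHAR,
    notional      NUMBER(18,2),
    raw_payload   VARIANT        -- semi-structured JSON from the trading platform
)
CLUSTER BY (trade_date, instrument_id);

-- Materialized view pre-aggregating daily notionals for reporting.
CREATE MATERIALIZED VIEW daily_notional AS
SELECT trade_date, instrument_id, SUM(notional) AS total_notional
FROM trades
GROUP BY trade_date, instrument_id;

-- Querying semi-structured JSON via the VARIANT column and FLATTEN.
SELECT
    t.trade_id,
    t.raw_payload:counterparty::STRING AS counterparty,
    f.value:tag::STRING                AS regulatory_tag
FROM trades t,
     LATERAL FLATTEN(input => t.raw_payload:tags) f;
```

Clustering keys and materialized views are the levers Snowflake offers in place of traditional indexes, which is why the role calls them out explicitly.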
Preferred Skills:
• Certified SnowPro Advanced: Architect
• Proficiency in Python or .NET for data manipulation and automation.
• Experience with Snowflake features like Snowpipe, Streams, Tasks, and Time Travel.
• Understanding of data governance frameworks and compliance standards.
• Strong understanding of financial services workflows, including Collateral Management and Corporate Action.
• Excellent communication and stakeholder engagement skills.
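The preferred Snowflake features listed above (Snowpipe, Streams, Tasks, Time Travel) typically combine into a continuous ingestion pipeline. A minimal sketch, again with hypothetical object names:

```sql
-- Snowpipe auto-ingesting JSON trade files from an external stage
-- into a single-VARIANT-column landing table.
CREATE PIPE trades_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_trades
  FROM @trades_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Stream capturing rows added to raw_trades since last consumption.
CREATE STREAM raw_trades_stream ON TABLE raw_trades;

-- Task that applies new rows to the curated table every 5 minutes,
-- but only when the stream actually has data.
CREATE TASK merge_trades
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_TRADES_STREAM')
AS
  INSERT INTO trades (trade_id, trade_date)
  SELECT payload:trade_id::NUMBER, payload:trade_date::DATE
  FROM raw_trades_stream;

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_trades RESUME;

-- Time Travel: query the table as it looked one hour ago (e.g., for audit).
SELECT * FROM trades AT (OFFSET => -3600);
```

Streams plus Tasks give exactly-once change consumption without hand-rolled watermark columns, and Time Travel supports the auditability requirements mentioned under the role's governance duties.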