

Information Consulting Services
Snowflake Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Engineer on a contract through 04/30/2026, offering 40 hours/week of fully remote work. Key skills include expert-level Snowflake, strong Python, and advanced SQL. Requires 5+ years in data engineering and familiarity with at least one major cloud platform.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 15, 2026
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: Wayne, PA
Skills detailed: #Airflow #Data Integrity #Cloud #Security #Microsoft Power BI #BI (Business Intelligence) #AWS (Amazon Web Services) #Automation #Data Pipeline #Data Processing #Data Science #Tableau #ETL (Extract, Transform, Load) #Azure #Data Modeling #Python #Complex Queries #SQL (Structured Query Language) #Compliance #SQL Queries #Data Governance #Scala #Data Engineering #Data Quality #GCP (Google Cloud Platform) #Snowflake
Role description
We're hiring a Snowflake Engineer to join a data engineering team building and optimizing pipelines that power analytics and business intelligence. This is a hands-on role for someone strong in Snowflake, Python, and SQL, with a focus on scalable ETL, performance tuning, and reliable data delivery. Remote is OK (US-based).
Location / Work Setup
• Remote (US-based)
• 40 hours/week
• Contract through 04/30/2026
What You'll Do
• Design, implement, and maintain scalable data pipelines and ETL processes using Snowflake.
• Develop and optimize SQL queries and stored procedures for efficient processing.
• Build and maintain Python workflows for automation and integrations (a minimal sketch follows this list).
• Partner with analysts, data scientists, and stakeholders to deliver high-quality data solutions.
• Ensure data integrity, security, and compliance with relevant standards.
• Monitor, troubleshoot, and tune data systems for availability and performance.
• Document data models, pipelines, and best practices.
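To give candidates a concrete feel for the work, here is a minimal sketch of a Python-driven Snowflake ETL step of the kind described above, assuming the snowflake-connector-python package. The connection details and the table/column names (DAILY_LOANS, LOANS_RAW) are illustrative placeholders, not specifics from this posting.

import os
import snowflake.connector  # pip install snowflake-connector-python

# Connection details are placeholders -- the account, warehouse,
# database, and schema below are illustrative, not from this posting.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account="your_account_locator",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # One idempotent ETL step: merge freshly staged rows into a
    # reporting table. Table and column names are hypothetical.
    cur.execute(
        """
        MERGE INTO REPORTING.DAILY_LOANS AS t
        USING STAGING.LOANS_RAW AS s
        ON t.LOAN_ID = s.LOAN_ID
        WHEN MATCHED THEN UPDATE SET t.BALANCE = s.BALANCE
        WHEN NOT MATCHED THEN INSERT (LOAN_ID, BALANCE)
            VALUES (s.LOAN_ID, s.BALANCE)
        """
    )
    print(f"Rows merged: {cur.rowcount}")
finally:
    conn.close()

A MERGE like this is a common way to keep a pipeline step rerunnable without duplicating rows, which is what "reliable data delivery" usually comes down to in practice.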
Required Qualifications
• 5+ years in data engineering (or closely related)
• Expert-level Snowflake (including performance tuning and advanced features)
• Strong Python for data processing/automation
• Advanced SQL (complex queries + optimization)
• Experience with data modeling, ETL frameworks, and data warehousing concepts
• Familiarity with at least one cloud platform (AWS, Azure, or GCP)
• Strong problem-solving and communication skills
Preferred / Nice-to-Have
• Experience with mortgage and loan data (lending processes and compliance concepts)
• Knowledge of data governance / data quality frameworks in financial services
• Exposure to BI tools (Tableau, Power BI) and orchestration tools (Airflow; see the sketch after this list)
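For the Airflow nice-to-have, here is a minimal sketch of how a step like the merge above might be scheduled, assuming Airflow 2.x. The DAG id, task id, and schedule are hypothetical examples.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_daily_merge():
    # Placeholder: would call the Snowflake merge step sketched earlier.
    pass

# DAG id, task id, and schedule are hypothetical examples.
with DAG(
    dag_id="snowflake_daily_load",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    merge_task = PythonOperator(
        task_id="merge_daily_loans",
        python_callable=run_daily_merge,
    )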






