

IntePros
Snowflake Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Engineer (Contract) focused on enhancing a Snowflake data warehouse in the mortgage and real estate domain. Requires 2+ years of Snowflake experience, strong SQL and Python skills, and familiarity with AWS. 100% remote.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 8, 2026
Duration: Unknown
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: King of Prussia, PA
Skills detailed: #ETL (Extract, Transform, Load) #Data Processing #DevOps #Data Warehouse #Cloud #Python #Data Quality #Snowflake #SQL (Structured Query Language) #Data Engineering #S3 (Amazon Simple Storage Service) #Data Pipeline #Automation #Data Orchestration #AWS (Amazon Web Services) #Compliance #Scala #Deployment #Datasets
Role description
Snowflake Data Engineer (Contract)
Location: 100% Remote (U.S. only)
Work Hours: EST preferred
Engagement: Contract role only
Overview
We are seeking a highly skilled Snowflake Data Engineer to support and enhance a Snowflake-based data warehouse environment within the mortgage and real estate domain. This role is heavily focused on Snowflake, SQL, and Python, with an emphasis on orchestrating ELT workflows directly within the Snowflake platform.
The ideal candidate will have hands-on experience building and optimizing Snowflake pipelines using streams, dynamic tables, stored procedures, and DAG-based orchestration, while working with complex, imperfect datasets in a fast-moving production environment.
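For context, the in-platform orchestration pattern this role centers on looks roughly like the sketch below: a stream capturing changes on a raw table, and a scheduled task merging those changes into a curated layer. This is a minimal sketch only; the object names (raw.loan_events, curated.merge_loan_events, INGEST_WH) and the schedule are hypothetical illustrations, not details of the actual environment.

    # Minimal sketch of the Snowflake stream + task pattern described above.
    # All object names are hypothetical; credentials come from the environment.
    import os
    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="INGEST_WH",
    )

    statements = [
        # Capture row-level changes on the raw table.
        """CREATE STREAM IF NOT EXISTS raw.loan_events_stream
           ON TABLE raw.loan_events""",
        # A scheduled task that merges new changes into the curated layer,
        # but only runs when the stream actually has data.
        """CREATE TASK IF NOT EXISTS curated.merge_loan_events
           WAREHOUSE = INGEST_WH
           SCHEDULE = '15 MINUTE'
           WHEN SYSTEM$STREAM_HAS_DATA('raw.loan_events_stream')
           AS
           INSERT INTO curated.loan_events
           SELECT * FROM raw.loan_events_stream""",
        # Tasks are created suspended; resume to activate.
        "ALTER TASK curated.merge_loan_events RESUME",
    ]

    with conn.cursor() as cur:
        for stmt in statements:
            cur.execute(stmt)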
Key Responsibilities
• Support and enhance a Snowflake data warehouse, with a strong focus on performance, scalability, and reliability
• Design and maintain ELT pipelines that move data across multiple layers (raw, processed, curated) within Snowflake
• Develop and tune heavy SQL workloads, including complex transformations and query optimization
• Implement and manage Snowflake streams, dynamic tables, tasks, and stored procedures
• Use Python to orchestrate workflows, build DAGs, and automate data processes within and around Snowflake
• Process inbound data files, converting formats (e.g., flat files → Parquet) and loading data into S3 and Snowflake (a file-conversion sketch follows this list)
• Support application-side data needs, including building data pipelines for a pricing engine integrated with third-party suppliers
• Handle imperfect, incomplete, or incorrect data, improving data quality, validation, and processing logic
• Partner closely with data engineers, application engineers, and other technical teams to deliver scalable solutions
• Contribute to DevOps and deployment practices for data pipelines and Snowflake environments
• Document data flows, transformations, and Snowflake use cases
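As one illustration of the file-processing responsibility above, the following is a minimal sketch of a flat-file-to-Parquet landing step, assuming pandas, pyarrow, and boto3 are available; the bucket, prefix, stage, and table names are hypothetical placeholders, not details of the actual pipeline.

    # Minimal sketch: pipe-delimited flat file -> Parquet -> S3 staging.
    # Bucket, prefix, stage, and table names are hypothetical placeholders.
    import os

    import boto3
    import pandas as pd

    def land_file(local_path: str) -> None:
        # Parse the inbound flat file (assumed pipe-delimited for this sketch).
        df = pd.read_csv(local_path, sep="|")

        # Rewrite it as Parquet (pandas uses pyarrow under the hood here).
        parquet_path = os.path.splitext(local_path)[0] + ".parquet"
        df.to_parquet(parquet_path, index=False)

        # Drop the Parquet file where a Snowflake external stage points.
        boto3.client("s3").upload_file(
            parquet_path,
            "example-mortgage-data",                      # hypothetical bucket
            f"inbound/{os.path.basename(parquet_path)}",  # hypothetical prefix
        )

    # The Snowflake side is then a COPY from the external stage, for example:
    COPY_SQL = """
    COPY INTO raw.loans
    FROM @raw.inbound_stage/inbound/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """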
Required Qualifications
• 2+ years of hands-on experience working with Snowflake in a production environment
• Strong expertise in SQL, including query optimization and complex transformations within Snowflake
• Proficiency in Python for data processing, orchestration, and workflow automation
• Experience building and managing DAGs for data orchestration (see the minimal sketch after this list)
• Deep understanding of Snowflake architecture and features, including streams, dynamic tables, tasks, and stored procedures
• Experience implementing ELT patterns within cloud-based data platforms
• Familiarity with AWS, particularly S3 and file-based ingestion workflows
• Exposure to DevOps practices in data engineering environments
• Strong communication skills and the ability to collaborate effectively with technical teams
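The DAG requirement above does not name a specific orchestrator, so the following is a minimal, standard-library-only sketch of dependency-ordered execution using Python's graphlib (3.9+); the task names and bodies are hypothetical placeholders.

    # Minimal DAG sketch using only the standard library.
    # Each key maps to the set of tasks that must complete before it.
    from graphlib import TopologicalSorter

    def extract():   print("pull inbound files")
    def transform(): print("clean and conform")
    def publish():   print("load curated layer")

    dag = {
        "extract": set(),
        "transform": {"extract"},
        "publish": {"transform"},
    }
    tasks = {"extract": extract, "transform": transform, "publish": publish}

    # static_order() yields task names so every dependency runs first.
    for name in TopologicalSorter(dag).static_order():
        tasks[name]()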
Nice to Have
• Experience working with mortgage, lending, or real estate data
• Familiarity with financial services data models, compliance, or regulatory considerations
• Experience supporting application-facing data pipelines