

Snowflake Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer with 8-12 years of experience, offering a 6+ month remote contract. Key skills include Snowflake, Python, SQL, and DBT. Work authorization is limited to Green Card holders, US Citizens, and H1 visa holders.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 30, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: Snowpark, Data Engineering, Python, SQL Queries, SQL (Structured Query Language), dbt (data build tool), Documentation, Snowflake, Data Pipeline, Scala, ETL (Extract, Transform, Load), Data Modeling
Role description
Position: Snowflake Developer
Location: Remote
Duration: 6+ Months Contract
Years of experience: 8-12
Work Authorization: Green Card holders, US Citizens, and H1 visa holders only
Must-have skills: Snowflake, Python, Snowpark, and SQL; DBT knowledge is nice to have.
We are seeking a skilled Data Engineer with hands-on experience in Snowflake, Python, SQL, DBT, and Snowpark to design, build, and optimize scalable data pipelines and analytics solutions.
Responsibilities:
Design, develop, and optimize data pipelines and ELT/ETL workflows using Snowflake, Python, and DBT.
Build and maintain DBT models (sources, staging, fact/dim models, incremental loads, testing, and documentation).
Work with Snowpark to implement advanced transformations and enable data applications in Snowflake.
Write efficient SQL queries for complex transformations, performance optimization, and data modeling.
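As an illustration of the DBT responsibilities above, a minimal incremental model might look like the following sketch. All table and column names here are hypothetical, not part of this role's actual codebase:

```sql
-- models/staging/stg_orders.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_ts,
    amount
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what the
  -- target table already contains.
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

In a typical dbt project, the tests and documentation mentioned above would live alongside this model in a companion `schema.yml` file.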