

Enterprise Solutions Inc.
Snowflake DBT Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake DBT Engineer on a contract basis for 6 months, paying $60-$65/hr. Located in New York (Hybrid 2-3 days), it requires 8+ years of data engineering experience, proficiency in Snowflake, dbt, SQL Server, PostgreSQL, and Python.
Country
United States
Currency
$ USD
-
Day rate
520
-
Date
December 3, 2025
Duration
6 months
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
New York, NY
-
Skills detailed
#Data Engineering #Database Management #SQL Server #ETL (Extract, Transform, Load) #PostgreSQL #Azure #Azure Data Factory #Automation #Data Pipeline #SQL Queries #Data Science #Python #Documentation #Complex Queries #Scala #Airflow #SQL (Structured Query Language) #ADF (Azure Data Factory) #Data Analysis #Scripting #Data Governance #Data Integration #Data Manipulation #dbt (data build tool) #Databases #Snowflake
Role description
Title: Snowflake DBT Engineer
Location: New York (Hybrid 2-3 Days)
Type: Contract
Rate: $60/hr to $65/hr
Job Description
We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, dbt (Data Build Tool), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.
Primary Skills: Snowflake, Airflow, dbt, SQL Server, and PostgreSQL
Key Responsibilities:
• Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools (a minimal dbt model sketch follows this list).
• SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
• Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
• Database Management: Manage and maintain SQL Server and PostgreSQL databases.
• ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
• Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
• Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
• Troubleshooting: Identify and resolve data-related issues and discrepancies.
• Python Scripting: Utilize Python for data manipulation, automation, and integration tasks.
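For illustration, here is a minimal sketch of the kind of dbt incremental model on Snowflake that the pipeline responsibility above describes. The model, source, and column names (fct_orders, stg_orders, order_id, updated_at) are hypothetical and not taken from this posting.

-- models/marts/fct_orders.sql (hypothetical model name)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ ref('stg_orders') }}  -- upstream staging model, assumed to exist
{% if is_incremental() %}
-- on incremental runs, only pull rows newer than what is already loaded
where updated_at > (select max(updated_at) from {{ this }})
{% endif %}

dbt compiles the Jinja above into plain Snowflake SQL and generates the create/insert logic for the incremental materialization, so the engineer maintains only the select statement.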
Qualifications / Technical Skills
• Experience: Minimum of 8 years of experience in data engineering.
• Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
• Strong SQL skills, with the ability to write and optimize complex queries (see the query sketch after this list).
• Knowledge of Python for data manipulation and automation.
• Knowledge of data governance frameworks and best practices.
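As a small illustration of the query tuning this calls for, the sketch below keeps only the latest row per key using Snowflake's QUALIFY clause rather than a self-join; the table and column names (raw_orders, order_id, updated_at) are hypothetical.

-- Keep only the most recent row per order_id.
-- QUALIFY filters on the window function directly, avoiding a self-join
-- or an extra subquery, so Snowflake can do the deduplication in one pass.
select *
from raw_orders
qualify row_number() over (
    partition by order_id
    order by updated_at desc
) = 1;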
Soft Skills
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration skills.
• Positive attitude and ability to work well in a team environment.
• Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.





