

Ampstek
Snowflake Data Engineer (Client Preferred USC or GC)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Snowflake Data Engineer position based in New York, offering a long-term contract with a focus on Snowflake, SQL, and data modeling. Candidates should have hands-on experience with Snowflake, DBT, and cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 7, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
New York, United States
🧠 - Skills detailed
#Data Quality #dbt (data build tool) #Cloud #Snowflake #GCP (Google Cloud Platform) #Compliance #Data Science #Data Architecture #Airflow #Data Security #SQL (Structured Query Language) #Version Control #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Data Analysis #Scala #Data Modeling #GIT #Data Governance #Data Ingestion #Security #Data Engineering #BI (Business Intelligence) #Azure #Data Pipeline
Role description
Role: Snowflake Data Engineer (Client Preferred USC or GC)
Location: New York, New York, United States (Onsite)
Job Type: Long Term Contract
Job Description:
Primary Skills: Snowflake, SQL, Data Modeling
The Data Engineer is responsible for building and maintaining scalable data pipelines and solutions using Snowflake and DBT (Data Build Tool). This role focuses on enabling analytics and business intelligence by ensuring data quality, performance, and governance.
Key Responsibilities:
• Data Pipeline Development:
  o Design and implement ETL/ELT pipelines using DBT and Snowflake.
  o Automate workflows for data ingestion, transformation, and loading.
• Data Modeling:
  o Develop efficient data models optimized for analytics and reporting.
  o Ensure alignment with enterprise data architecture and standards.
• Performance Optimization:
  o Optimize queries and transformations for Snowflake performance.
  o Implement best practices for cost management and scalability.
• Collaboration:
  o Work closely with data analysts, data scientists, and business teams.
  o Support BI and reporting initiatives by providing clean, structured data.
• Quality & Governance:
  o Implement data validation and testing within DBT.
  o Ensure compliance with data security and governance policies.
Required Skills & Qualifications:
• Hands-on experience with Snowflake and DBT.
• Strong SQL skills and understanding of data modeling concepts.
• Familiarity with cloud platforms (AWS, Azure, GCP).
• Knowledge of CI/CD pipelines and version control (Git).
• Experience with orchestration tools (Airflow, Prefect) is a plus.
• Understanding of data governance and compliance standards.
Thanks
Rakesh Pathak | Lead Recruiter
Phone: 609-360-2642
Rakesh.pathak@ampstek.com | www.ampstek.com
https://www.linkedin.com/in/rakesh-kumar-pathak-00b039167/






