DynPro Inc.

Snowflake Developer (Strong SQL Expertise Required)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Developer with strong SQL expertise, offering a contract length of "unknown" and a pay rate of "unknown." Required skills include 5+ years in Data Engineering, hands-on Snowflake experience, and proficiency in Python and dbt.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 13, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Governance #Cloud #Scripting #Data Engineering #Data Modeling #Scala #Data Quality #Apache Airflow #Snowflake #Airflow #GCP (Google Cloud Platform) #dbt (data build tool) #SQL (Structured Query Language) #AWS (Amazon Web Services) #Azure #SQL Queries #Data Analysis #ETL (Extract, Transform, Load) #Python #Data Pipeline #Security #Migration #BI (Business Intelligence) #Automation #Kafka (Apache Kafka) #Complex Queries
Role description
Job Description: We are looking for a Snowflake Developer with strong SQL expertise and hands-on experience in building data pipelines, transformations, and analytics models. The ideal candidate should have solid technical depth in data engineering and a proven ability to work with large-scale data systems in Snowflake.

Key Responsibilities:
• Develop and optimize complex SQL queries for data transformation, analysis, and reporting.
• Design, build, and maintain data pipelines and data models in Snowflake.
• Implement ETL/ELT processes to integrate structured and semi-structured data.
• Collaborate with data analysts and BI teams to improve query performance and scalability.
• Create modular, testable transformations using dbt (Data Build Tool).
• Manage workflow orchestration using Apache Airflow (or similar tools).
• Ensure data quality, performance tuning, and CI/CD implementation for data pipelines.

Required Skills & Qualifications:
• 5+ years of experience in Data Engineering or related roles.
• Strong SQL skills (must be able to write complex queries and optimize performance).
• Hands-on experience with Snowflake (data warehousing, query optimization, data sharing).
• Experience with dbt for data modeling and transformations.
• Experience with Apache Airflow (or similar workflow orchestration tools).
• Proficiency in Python for scripting and automation.
• Experience working with at least one cloud platform (AWS, Azure, or GCP).
• Exposure to streaming technologies like Kafka is a plus.

Preferred:
• Experience working on data-driven analytics projects rather than migrations.
• Knowledge of data governance, security, and cost optimization in Snowflake.

Regards,
Gaganpreet Singh
Lead - Talent Acquisition
www.dynpro.com