DynPro Inc.

Snowflake Developer (Strong SQL Expertise Required)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer with strong SQL expertise, offering a contract length of "unknown" and a pay rate of "unknown." Key skills include data engineering, Snowflake, DBT, Apache Airflow, and Python. Five years of relevant experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #dbt (data build tool) #Scala #Apache Airflow #Data Governance #BI (Business Intelligence) #Azure #Data Engineering #Data Quality #Python #SQL Queries #AWS (Amazon Web Services) #Complex Queries #Data Modeling #Migration #ETL (Extract, Transform, Load) #Airflow #Data Analysis #Security #Kafka (Apache Kafka) #Data Pipeline #SQL (Structured Query Language) #Snowflake #Automation #Scripting #Cloud
Role description
Job Description:
We are looking for a Snowflake Developer with strong SQL expertise and hands-on experience in building data pipelines, transformations, and analytics models. The ideal candidate should have solid technical depth in data engineering and a proven ability to work with large-scale data systems in Snowflake.

Key Responsibilities:
• Develop and optimize complex SQL queries for data transformation, analysis, and reporting.
• Design, build, and maintain data pipelines and data models in Snowflake.
• Implement ETL/ELT processes to integrate structured and semi-structured data.
• Collaborate with data analysts and BI teams to improve query performance and scalability.
• Create modular, testable transformations using DBT (Data Build Tool).
• Manage workflow orchestration using Apache Airflow (or similar tools).
• Ensure data quality, performance tuning, and CI/CD implementation for data pipelines.

Required Skills & Qualifications:
• 5+ years of experience in Data Engineering or related roles.
• Strong SQL skills (must be able to write complex queries and optimize performance).
• Hands-on experience with Snowflake (data warehousing, query optimization, data sharing).
• Experience with DBT for data modeling and transformations.
• Experience with Apache Airflow (or similar workflow orchestration tools).
• Proficiency in Python for scripting and automation.
• Experience working with at least one cloud platform (AWS, Azure, or GCP).
• Exposure to streaming technologies like Kafka is a plus.

Preferred:
• Experience working on data-driven analytics projects rather than migrations.
• Knowledge of data governance, security, and cost optimization in Snowflake.

Regards,
Gaganpreet Singh
Lead - Talent Acquisition
www.dynpro.com