Snowflake Engineer with DBT

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Engineer with dbt expertise, offering a contract length of "unknown" and a pay rate of "unknown." Key skills required include Snowflake, dbt, SQL, and ETL/ELT pipeline experience. Cloud platform familiarity is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 14, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Tableau #Compliance #Airflow #Security #Macros #Looker #SQL (Structured Query Language) #Data Governance #GCP (Google Cloud Platform) #Data Engineering #Cloud #ETL (Extract, Transform, Load) #Documentation #Snowflake #Azure #GIT #BI (Business Intelligence) #Data Modeling #Python #Data Quality #Scala #AWS (Amazon Web Services) #Microsoft Power BI #Data Pipeline #dbt (data build tool) #Version Control
Role description
Job Title: Snowflake Engineer with dbt

Overview: We are seeking a Snowflake Engineer with strong expertise in dbt to design, build, and maintain scalable data pipelines and models. The role will focus on enabling clean, reliable, and performant analytics within our data platform.

Responsibilities:
• Design, develop, and optimize data models in Snowflake using dbt.
• Build and maintain ETL/ELT pipelines to support analytics and reporting needs.
• Implement data quality, testing, and documentation standards using dbt features.
• Work with business and analytics teams to translate requirements into scalable data solutions.
• Monitor and optimize Snowflake performance, including query tuning and resource management.
• Support data governance, security, and compliance practices.
• Collaborate with engineers, analysts, and stakeholders to ensure reliable data delivery.

Qualifications:
• Strong hands-on experience with Snowflake (warehousing, performance tuning, security).
• Proficiency in dbt (data modeling, macros, testing, documentation).
• Solid SQL development skills.
• Experience with ELT/ETL pipelines and orchestration tools (Airflow, Dagster, Prefect, etc.).
• Familiarity with version control (Git) and CI/CD practices.
• Understanding of data governance, lineage, and best practices.
• Strong problem-solving and communication skills.

Preferred:
• Experience with cloud platforms (AWS, Azure, or GCP).
• Exposure to BI tools such as Power BI, Tableau, or Looker.
• Knowledge of Python for data engineering workflows.