Galent

Snowflake DBT Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake DBT Data Engineer with a contract length of "unknown," offering a pay rate of "unknown." Candidates should have 12+ years of experience, proficiency in Snowflake, ANSI-SQL, DBT, and relevant cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
January 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Irvine, CA
-
🧠 - Skills detailed
#Data Architecture #Azure #Clustering #Data Science #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Infrastructure as Code (IaC) #Data Ingestion #Cloud #Snowflake #Indexing #Python #dbt (data build tool) #AWS (Amazon Web Services) #Computer Science #Documentation #Terraform #Airflow #GIT #AI (Artificial Intelligence) #Normalization #Data Quality #Security #ETL (Extract, Transform, Load) #Data Engineering
Role description
We're hiring a Snowflake DBT Data Engineer. Join Galent and help us deliver high-impact technology solutions that shape the future of digital transformation.

Experience: 12+ years
Mandatory Skills: Snowflake, ANSI-SQL, DBT

Key Responsibilities:
• Design, develop, and maintain ELT pipelines using Snowflake and DBT
• Build and optimize data models in Snowflake to support analytics and reporting
• Implement modular, testable SQL transformations using DBT
• Integrate DBT workflows into CI/CD pipelines and manage infrastructure as code using Terraform
• Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions
• Optimize Snowflake performance through clustering, partitioning, indexing, and materialized views
• Automate data ingestion and transformation workflows using Airflow or similar orchestration tools
• Ensure data quality, governance, and security across pipelines
• Troubleshoot and resolve performance bottlenecks and data issues
• Maintain documentation for data architecture, pipelines, and operational procedures

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
• 7 years of experience in data engineering, with at least 2 years focused on Snowflake and DBT
• Strong proficiency in SQL and Python
• Experience with cloud platforms (AWS, GCP, or Azure)
• Familiarity with Git, CI/CD, and Infrastructure as Code tools (Terraform, CloudFormation)
• Knowledge of data modeling (star schema, normalization) and ELT best practices

Why Galent
Galent is a digital engineering firm that brings AI-driven innovation to enterprise IT. We're proud of our diverse and inclusive team culture, where bold ideas drive transformation.

Ready to Apply? Send your resume to Satish.k@galent.com