MARVEL Infotech Inc.

Snowflake DBT Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake DBT Engineer in Irvine, CA, with a contract length of "unknown" and a pay rate of "unknown." Key requirements include 7 years of data engineering experience, strong proficiency in SQL and Python, and familiarity with cloud platforms and Infrastructure as Code tools.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 12, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Irvine, CA
🧠 - Skills detailed
#Terraform #Indexing #GIT #Normalization #Documentation #GCP (Google Cloud Platform) #Snowflake #Airflow #Data Science #Data Quality #Python #ETL (Extract, Transform, Load) #Data Ingestion #SQL (Structured Query Language) #Infrastructure as Code (IaC) #Data Architecture #Clustering #Cloud #Security #dbt (data build tool) #Azure #AWS (Amazon Web Services) #Computer Science #Data Engineering
Role description
Snowflake DBT Engineer
Location: Irvine, CA (5 days onsite)
W2 only (visa-independent)

Key Responsibilities
• Design, develop, and maintain ELT pipelines using Snowflake and DBT
• Build and optimize data models in Snowflake to support analytics and reporting
• Implement modular, testable SQL transformations using DBT (a minimal sketch follows this section)
• Integrate DBT workflows into CI/CD pipelines and manage infrastructure as code using Terraform
• Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions
• Optimize Snowflake performance through clustering, partitioning, indexing, and materialized views (see the second sketch below)
• Automate data ingestion and transformation workflows using Airflow or similar orchestration tools
• Ensure data quality, governance, and security across pipelines
• Troubleshoot and resolve performance bottlenecks and data issues
• Maintain documentation for data architecture, pipelines, and operational procedures

Required Skills & Qualifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
• 7 years of experience in data engineering, with at least 2 years focused on Snowflake and DBT
• Strong proficiency in SQL and Python
• Experience with cloud platforms (AWS, GCP, or Azure)
• Familiarity with Git, CI/CD, and Infrastructure as Code tools (Terraform, CloudFormation)
• Knowledge of data modeling (star schema, normalization) and ELT best practices
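To give a sense of the "modular, testable SQL transformations" mentioned above, here is a minimal DBT model sketch. All model, table, and column names (stg_orders, fct_daily_revenue, order_date, amount) are hypothetical illustrations, not taken from the posting.

```sql
-- models/marts/fct_daily_revenue.sql
-- Hypothetical DBT model; names are illustrative assumptions.
-- ref() resolves the upstream staging model, letting DBT build the
-- dependency graph and run models in the correct order.

with orders as (

    select * from {{ ref('stg_orders') }}

),

daily as (

    select
        order_date,
        count(*)    as order_count,
        sum(amount) as total_revenue
    from orders
    group by order_date

)

select * from daily
```

In a real project, a schema.yml next to this model would declare generic tests such as not_null and unique on order_date, and `dbt build` would run the model and its tests together.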
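Likewise, the Snowflake performance work listed above often comes down to DDL like the following sketch. The schema, table, and column names are again hypothetical; clustering keys and materialized views are standard Snowflake features (materialized views require Enterprise Edition or higher), but whether they help depends on the actual query patterns.

```sql
-- Hypothetical Snowflake tuning examples; object names are assumptions.

-- Define a clustering key so Snowflake co-locates micro-partitions
-- by the column that large range scans filter on.
alter table analytics.fct_daily_revenue
    cluster by (order_date);

-- Precompute a frequently queried aggregate as a materialized view;
-- Snowflake maintains it automatically as the base table changes.
create or replace materialized view analytics.mv_monthly_revenue as
    select
        date_trunc('month', order_date) as order_month,
        sum(total_revenue)              as monthly_revenue
    from analytics.fct_daily_revenue
    group by date_trunc('month', order_date);
```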