

Prudent Technologies and Consulting, Inc.
Senior Data Engineer / Lead Data Engineer – Snowflake & dbt (Onsite)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer / Lead Data Engineer with 10+ years of experience, focusing on Snowflake and dbt. It requires advanced SQL skills, expertise in ELT pipeline development, and is located onsite in Phoenix, AZ, or San Antonio/Plano, TX.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Data Orchestration #dbt (data build tool) #Airflow #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Engineering #Data Vault #Data Warehouse #Snowflake #Data Pipeline #Cloud
Role description
Job Title : Senior Data Engineer / Lead Data Engineer – Snowflake & dbt
Location : Phoenix (AZ), San Antonio/Plano (TX)
Technical Skills
• 10+ years of experience in Data Engineering
• Strong hands-on experience with Snowflake cloud data warehouse
• Expertise in dbt (Core or Cloud) for ELT pipeline development
• Advanced proficiency in SQL
• Strong experience with dimensional modeling (Kimball / Data Vault)
Data Pipeline Development
• Experience building scalable ELT/ETL pipelines
• Expertise with incremental loads, snapshots, and transformation strategies
• Experience with data orchestration tools (Airflow, Prefect, Dagster, etc.)
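To illustrate the incremental-load pattern the role calls for, here is a minimal sketch of a dbt incremental model; the model, source, and column names are hypothetical:

```sql
-- models/marts/fct_orders.sql  (hypothetical model; assumes a source named 'raw')
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only pull rows newer than what is already loaded
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Running `dbt run --select fct_orders` builds the table in full on the first run; on later runs, `is_incremental()` is true and only new or changed rows are processed (on Snowflake, dbt merges them on the `unique_key`).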