Jobs via Dice

Data Engineer (Snowflake + DBT) | W2 Contract | Local to TX Only

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Snowflake + DBT) on a W2 contract, local to TX only, offering competitive pay. Requires 10+ years of Data Engineering experience, 4+ years with Snowflake and DBT, advanced SQL skills, and cloud platform knowledge.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 22, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#SQL Queries #Data Warehouse #Data Vault #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Scala #Cloud #Data Modeling #Azure #Data Pipeline #Data Engineering #Kafka (Apache Kafka) #Vault #Scrum #AWS (Amazon Web Services) #Schema Design #Scripting #Agile #Python #Data Processing #Documentation #Data Lake #Airflow #Snowflake #Version Control #Data Quality #dbt (data build tool) #GIT
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, JKV International, is seeking the following. Apply via Dice today!

Job Description
We are looking for a skilled Data Engineer with strong expertise in Snowflake and DBT to design, build, and optimize scalable data pipelines and modern data warehouse solutions. The ideal candidate should have hands-on experience in ELT frameworks, cloud platforms, and data modeling.

Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Snowflake and DBT
• Build and optimize data warehouse solutions, including schema design and performance tuning
• Develop modular and reusable data models using DBT (Data Build Tool)
• Implement and manage ELT/ETL processes for ingesting data from multiple sources
• Write complex SQL queries and ensure data quality, validation, and testing
• Collaborate with business and technical teams to gather and translate data requirements
• Automate workflows using orchestration tools (Airflow or similar)
• Maintain documentation for data pipelines, models, and transformations
• Troubleshoot and optimize data pipelines for performance and scalability

Required Skills
• Strong experience with Snowflake (data warehousing, performance tuning, optimization)
• Hands-on experience with DBT (dbt Core / dbt Cloud)
• Advanced SQL skills (query optimization, joins, window functions)
• Experience with data modeling (Star Schema, Snowflake Schema, Data Vault)
• Knowledge of ETL/ELT processes and data pipeline architecture
• Experience with Python or scripting for data processing
• Familiarity with cloud platforms (AWS / Azure / Google Cloud Platform)
• Experience with CI/CD, Git, and version control
• Strong understanding of data quality, governance, and testing

Preferred Skills
• Experience with Airflow / orchestration tools
• Knowledge of streaming tools (Kafka, etc.)
• Exposure to data lake / lakehouse architecture
• Experience in Agile/Scrum environments

Experience
• 10+ years of Data Engineering experience
• 4+ years hands-on with Snowflake + DBT
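For candidates unfamiliar with the "modular and reusable data models using DBT" responsibility above, a minimal sketch of what a dbt model typically looks like follows. It is not taken from this posting: the model, column, and upstream names (fct_daily_orders, stg_orders, order_id, order_total) are hypothetical, and the incremental configuration is just one common pattern.

-- models/marts/fct_daily_orders.sql
-- Hypothetical dbt model: aggregates a staging model into a daily fact table.
-- {{ ref() }} resolves the upstream model and records lineage in dbt's DAG.

{{ config(materialized='incremental', unique_key='order_date') }}

select
    order_date,
    count(distinct order_id) as order_count,
    sum(order_total)         as gross_revenue
from {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- on incremental runs, only reprocess the last few days of data
  where order_date >= dateadd(day, -3, current_date)
{% endif %}
group by order_date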
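The "advanced SQL skills (window functions)" requirement is the kind of thing often probed in interviews for roles like this. A short illustrative Snowflake-style query is sketched below; the table and column names (orders, customer_id, order_total) are made up for the example, and QUALIFY is a Snowflake-specific convenience for filtering on window results.

-- Hypothetical window-function example in Snowflake SQL:
-- rank each customer's orders by value and keep the top 3 per customer.
select
    customer_id,
    order_id,
    order_total,
    row_number() over (
        partition by customer_id
        order by order_total desc
    ) as order_rank
from orders
qualify order_rank <= 3;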