Snowflake DBT Lead

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake DBT Lead in Dallas, TX; the contract length and pay rate are unknown. Candidates should have 3-6 years of Snowflake experience, 2 years with DBT, and knowledge of SAP data sources.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date discovered
September 10, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
On-site
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Dallas, TX
🧠 - Skills detailed
#Clustering #Compliance #Airflow #Security #ADF (Azure Data Factory) #Metadata #Normalization #Snowflake #AWS S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Scrum #Snowpark #SnowSQL #ETL (Extract, Transform, Load) #Data Catalog #Data Quality #Data Governance #Data Warehouse #SAP #Cloud #S3 (Amazon Simple Storage Service) #Storage #Azure #Macros #Version Control #Azure Data Factory #Documentation #SQL (Structured Query Language) #BitBucket #Databases #Data Analysis #Jenkins #dbt (data build tool) #GIT #GCP (Google Cloud Platform) #Agile #Data Security #YAML (YAML Ain't Markup Language) #Scala #Data Engineering #SnowPipe
Role description
Snowflake DBT Lead | Dallas, TX | Onsite

Role Overview
We are looking for a highly skilled Sr. Data Engineer with deep expertise in Snowflake and DBT to join our data engineering team. The ideal candidate will be responsible for designing, building, and optimizing scalable data transformation pipelines and ensuring data quality and performance across our analytics platforms.

Key Responsibilities

Technical
• Design and implement modular, reusable DBT models for data transformation in Snowflake (illustrated in the sketch after this section)
• Optimize Snowflake performance through clustering, partitioning, caching, and query tuning
• Define and manage schema objects, including databases, schemas, tables, views, and stages
• Build and maintain ELT pipelines using Snowflake-native features like Snowpipe, Streams, and Tasks
• Integrate Snowflake with external data sources and cloud storage (e.g., AWS S3, Azure Blob, GCP)
• Optimize query performance using clustering keys, result caching, and materialized views
• Monitor and tune warehouse performance and cost efficiency
• Leverage advanced Snowflake features like Time Travel, Zero-Copy Cloning, and Data Sharing
• Explore and implement UDFs, external functions, and Snowpark where applicable
• Ensure compliance with data governance and privacy standards
• Automate workflows using orchestration tools (e.g., Airflow, Azure Data Factory)
• Schedule and monitor data jobs using Snowflake Tasks and external schedulers
• Collaborate with data analysts, architects, and business stakeholders to translate requirements into scalable data solutions
• Design and implement DBT projects from scratch, including folder structure, model layers (staging, intermediate, marts), and naming conventions
• Use Git for version control of DBT projects
• Design, build, and maintain modular DBT models for data transformation
• Implement staging, intermediate, and mart layers following best practices
• Use Jinja templating and macros to create reusable logic
• Define and manage tests (e.g., uniqueness, not null, accepted values) within DBT
• Monitor test results and resolve data quality issues proactively
• Implement CI/CD pipelines for DBT projects using Git, Bitbucket, and Jenkins
• Ensure data governance, lineage, and documentation using tools like dbt docs and metadata tagging
• Integrate Snowflake with cloud storage (e.g., GCP, Azure Blob, AWS S3) and orchestration tools (e.g., Airflow, Azure Data Factory)
• Troubleshoot and resolve data quality issues and performance bottlenecks
• Implement role-based access controls and data masking where required
• Ensure compliance with data governance and privacy policies
• Integrate DBT with orchestration tools (e.g., Airflow, Prefect)
• Schedule and monitor DBT runs in production environments

Functional
• Prior experience working with sources like SAP ECC and S/4HANA
• Functional understanding of at least one SAP module: Supply Chain, Finance (FICO), or Sales & Distribution
• Prior experience pulling data from SAP sources
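To give a flavor of the modular, layered DBT work the responsibilities above describe, here is a minimal sketch of a staging model feeding an incremental mart model in Snowflake. It is illustrative only, not part of the posting: all source, table, and column names (raw_sap, stg_sales_orders, fct_daily_sales, etc.) are hypothetical placeholders, and the raw_sap source is assumed to be declared in the project's YAML.

    -- models/staging/stg_sales_orders.sql: light cleanup over a raw SAP extract
    select
        order_id,
        customer_id,
        cast(order_date as date) as order_date,
        amount
    from {{ source('raw_sap', 'sales_orders') }}

    -- models/marts/fct_daily_sales.sql: incremental mart built on the staging layer
    {{ config(materialized='incremental', unique_key='order_date') }}
    select
        order_date,
        sum(amount) as total_amount
    from {{ ref('stg_sales_orders') }}
    {% if is_incremental() %}
      -- on incremental runs, only reprocess days newer than what the target already holds
      where order_date > (select max(order_date) from {{ this }})
    {% endif %}
    group by order_date

The staging/mart split keeps source-specific cleanup separate from business logic, which is the layering convention the responsibilities list refers to.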
Required Skills
• 3-6 years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel
• 2 years of hands-on experience with DBT (Core or Cloud) in a production environment
• Strong SQL skills and experience with data modelling (star/snowflake schema, normalization/denormalization)
• Deep understanding of DBT features: materializations (table, view, incremental, ephemeral), macros, seeds, snapshots, tests, and documentation
• Experience with cloud data warehouses (Snowflake)
• Proficiency in Git, CI/CD, and workflow orchestration (e.g., Airflow, dbt Cloud)
• Familiarity with Jinja templating, YAML configuration, and DBT project structure (see the macro sketch at the end of this posting)
• Strong communication skills and ability to work cross-functionally

Preferred Qualifications
• SnowPro Core Certification or equivalent
• Experience with Airflow, Azure Data Factory, or similar orchestration tools
• Familiarity with data cataloging and lineage tools
• Knowledge of data security, RBAC, and masking in Snowflake
• Experience working in Agile/Scrum environments
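Since the skills list calls out Jinja templating and macros specifically, a short sketch of a reusable DBT macro follows. The macro name, arguments, and the model it is used in are hypothetical examples, not taken from the posting.

    -- macros/cents_to_dollars.sql: reusable conversion logic callable from any model
    {% macro cents_to_dollars(column_name, precision=2) %}
        round({{ column_name }} / 100.0, {{ precision }})
    {% endmacro %}

    -- example usage inside a model:
    -- select {{ cents_to_dollars('amount_cents') }} as amount_dollars
    -- from {{ ref('stg_payments') }}

Centralizing logic like this in a macro keeps transformations consistent across models and is the kind of "reusable logic" the responsibilities section asks candidates to build.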