Mastech Digital

Sr DBT Analytics Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr DBT Analytics Engineer with a contract length of 6 months+, offering remote work in the USA. Key skills required include 5+ years in analytics engineering, expert SQL, extensive dbt experience, and strong Python programming.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
April 21, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Monitoring #Documentation #Data Governance #Cloud #Redshift #GDPR (General Data Protection Regulation) #Python #GIT #Programming #Data Modeling #ETL (Extract, Transform, Load) #Data Warehouse #Compliance #Tableau #Data Mart #Data Engineering #dbt (data build tool) #Databricks #Datasets #Data Science #Snowflake #BigQuery #Version Control #SQL (Structured Query Language) #Data Quality #BI (Business Intelligence) #Looker #Scala #Airflow
Role description
Title: DBT Engineer
Duration: 6 Months+
Location: Remote/USA
Job Description:
What You’ll Do
• Design and implement robust dimensional and relational data models.
• Build and maintain scalable dbt transformation pipelines.
• Own transformation and modeling of curated (Silver/Gold) datasets.
• Collaborate with analysts, product analytics, data scientists, and stakeholders.
• Implement data quality tests, monitoring, SLAs, and alerting.
• Partner with Data Engineers to define and enforce data contracts.
• Follow analytics engineering best practices (version control, testing, documentation).
• Empower self-service analytics with intuitive, well-documented data marts.
What We’re Looking For
• 5+ years in analytics engineering, data modeling, or similar roles.
• Expert-level SQL skills (optimization and performance tuning).
• Extensive dbt experience (testing, documentation, package management).
• Strong Python programming skills.
• Deep understanding of dimensional modeling (star schemas, one big table).
• Experience with cloud data warehouses (Snowflake, BigQuery, Redshift, Databricks).
• Familiarity with Airflow and orchestration frameworks.
• Experience with Git and CI/CD for analytics code.
• Strong business acumen and ability to translate requirements into data models.
• Knowledge of data governance, privacy, and compliance (GDPR, CCPA, SOX).
• Familiarity with BI tools (Tableau, Looker, Mode).
• Strong ownership mindset and communication skills.
• Dedication to data quality, documentation, and enabling self-service analytics.
• Bachelor’s degree in a technical field, or equivalent experience.
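To illustrate the kind of dbt modeling and data-quality testing the role involves, here is a minimal sketch of a star-schema fact model with generic dbt tests. The model, table, and column names are hypothetical examples, not taken from this posting:

```sql
-- models/marts/fct_orders.sql
-- Hypothetical Gold-layer fact model: joins curated (Silver) orders
-- to a customer dimension, star-schema style.
select
    o.order_id,
    o.customer_id,
    c.customer_segment,
    o.order_total,
    o.ordered_at
from {{ ref('stg_orders') }} as o
left join {{ ref('dim_customers') }} as c
    on o.customer_id = c.customer_id
```

```yaml
# models/marts/fct_orders.yml
# Generic dbt tests enforcing basic data-quality contracts:
# primary-key uniqueness and referential integrity to the dimension.
models:
  - name: fct_orders
    columns:
      - name: order_id
        tests: [unique, not_null]
      - name: customer_id
        tests:
          - relationships:
              to: ref('dim_customers')
              field: customer_id
```

Tests like these run with `dbt test` and are one common way teams implement the "data quality tests, monitoring, SLAs" and "data contracts" responsibilities listed above.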