

Gardner Resources Consulting, LLC
Lead DBT Data Modeling Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead DBT Data Modeling Engineer with a contract length of "unknown" and a pay rate of "unknown." Key skills include SQL, dbt, BigQuery, Python, and experience in analytics engineering. A bachelor's degree and 10+ years of relevant experience are required.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
February 18, 2026
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Boston, MA
-
Skills detailed
#Redshift #SQL (Structured Query Language) #Scala #Looker #Data Warehouse #Data Modeling #ETL (Extract, Transform, Load) #Vault #Airflow #BigQuery #AWS (Amazon Web Services) #Mathematics #Apache Beam #Version Control #Cloud #Snowflake #Git #Docker #Fivetran #GCP (Google Cloud Platform) #Computer Science #dbt (data build tool) #Python
Role description
DBT SME - We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global software organization. The company manages large volumes of data across multiple business units, and this role will play a key part in organizing and maturing that landscape as part of a multi-year strategic roadmap.
Business Stakeholder Engagement
• Gather and document complex business requirements.
• Translate business needs into scalable, maintainable data products.
• Serve as a trusted data partner across multiple departments.
Data Modeling & Transformation
• Design and implement robust, reusable data models within the warehouse.
• Develop and maintain SQL transformations in dbt.
• Optimize existing models and queries for performance, cost-efficiency, and maintainability.
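For context, transformations like those above are typically expressed as dbt models: SQL files in which the `source()` and `ref()` macros are compiled into fully qualified warehouse table names. A minimal sketch of a staging model, assuming a hypothetical `raw` source with a `raw_orders` table (all names here are illustrative, not from the posting):

```sql
-- models/staging/stg_orders.sql
-- Minimal dbt staging model (illustrative; source and column names are hypothetical).
with source as (
    -- dbt resolves source('raw', 'raw_orders') to the configured BigQuery table
    select * from {{ source('raw', 'raw_orders') }}
),

renamed as (
    select
        order_id,
        customer_id,
        cast(order_total as numeric) as order_total,
        date(created_at) as order_date
    from source
)

select * from renamed
```

Downstream models would then reference this one with `{{ ref('stg_orders') }}`, letting dbt build the dependency graph and run transformations in order.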
Qualifications
• Bachelor's degree in Economics, Mathematics, Computer Science, or related field.
• 10+ years of experience in an Analytics Engineering role.
• Expert in SQL and dbt with demonstrated modeling experience (Vault, 3NF, Dimensional).
• Hands-on experience with BigQuery or other cloud data warehouses.
• Proficiency in Python and Docker.
• Experience with Airflow (Composer), Git, and CI/CD pipelines.
• Strong attention to detail and communication skills; able to interact with both technical and business stakeholders.
Technical Environment
• Primary Data Warehouse: BigQuery
• Nice to Have: Snowflake, Redshift
• Orchestration: Airflow (GCP Composer)
• Languages: Expert-level SQL / dbt; strong Python required
• Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker
• Modeling Techniques: Vault 2.0, 3NF, Dimensional Modeling, etc.
• Version Control: Git / CI-CD






