Gardner Resources Consulting, LLC

DBT SME - Data Modeling, Analytics Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a "DBT SME - Data Modeling, Analytics Engineer": a long-term, remote contract paid on a W2 or C2C basis. It requires 10+ years in analytics engineering, expertise in SQL and dbt, and experience with BigQuery.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 4, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Python #Version Control #Data Warehouse #Fivetran #Apache Beam #SQL (Structured Query Language) #Mathematics #BigQuery #dbt (data build tool) #Looker #Data Pipeline #GCP (Google Cloud Platform) #Airflow #Redshift #Leadership #ETL (Extract, Transform, Load) #Vault #GIT #Snowflake #Docker #Cloud #Data Modeling #Computer Science #AWS (Amazon Web Services) #Data Engineering
Role description
We're seeking a Lead Analytics Engineer to help design, model, and scale a modern data environment for a global software organization. This role will play a key part in organizing and maturing that landscape as part of a multi-year strategic roadmap. The position is ideal for a senior-level analytics engineer who can architect data solutions, build robust models, and stay hands-on with development.
• This is a remote role with occasional onsite meetings. Candidates must currently be local to the Boston area and reside in MA/CT/RI/NH/ME.
• Long-term contract. W2 or C2C.
Highlights:
• Architect and build new data models using dbt and modern modeling techniques.
• Partner closely with leadership and business teams to translate complex requirements into technical solutions.
• Drive structure and clarity within a growing analytics ecosystem.
Qualifications
• Bachelor's degree in Economics, Mathematics, Computer Science, or a related field.
• 10+ years of experience in an Analytics Engineering role.
• Expert in SQL and dbt with demonstrated modeling experience.
• Data Modeling & Transformation: Design and implement robust, reusable data models within the warehouse; develop and maintain SQL transformations in dbt.
• Data Pipeline & Orchestration: Build and maintain reliable data pipelines in collaboration with data engineering; use orchestration tools (Airflow) to manage and monitor workflows; manage and support dbt environments and transformations.
• Hands-on experience with BigQuery or other cloud data warehouses.
• Proficiency in Python and Docker.
• Experience with Airflow (Composer), Git, and CI/CD pipelines.
• Strong attention to detail and communication skills; able to interact with both technical and business stakeholders.
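For candidates gauging fit, the dbt modeling work described above typically looks like the following staging model. This is a generic sketch, not code from the employer; the source, table, and column names (`app_db`, `orders`, etc.) are hypothetical.

```sql
-- models/staging/stg_orders.sql
-- Hypothetical dbt staging model: reads from a declared source,
-- renames columns to a consistent convention, and exposes a clean
-- relation for downstream marts. All identifiers are illustrative.
with source as (

    select * from {{ source('app_db', 'orders') }}

),

renamed as (

    select
        id           as order_id,
        user_id      as customer_id,
        status       as order_status,
        created_at   as ordered_at,
        total_amount as order_total_usd
    from source

)

select * from renamed
```

Downstream dimensional or Data Vault models would then build on staging models like this one via `{{ ref('stg_orders') }}`, keeping raw-source references confined to the staging layer.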
Technical Requirements:
• Primary Data Warehouse: BigQuery (mandatory)
• Nice to Have: Snowflake, Redshift
• Orchestration: Airflow (GCP Composer)
• Languages: Expert-level SQL / dbt; strong Python required
• Other Tools: GCP or AWS, Fivetran, Apache Beam, Looker or Preset, Docker
• Modeling Techniques: Data Vault 2.0, 3NF, Dimensional Modeling, etc.
• Version Control: Git / CI/CD
• Quality Tools: dbt-Elementary, dbt-Osmosis, or Great Expectations preferred