Onsite DBT Data Architect, Technical Lead, Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Onsite DBT Data Architect, Technical Lead, Senior Data Engineer in Cincinnati, OH, on a contract basis. Requires 10+ years of experience with dbt, Snowflake, data architecture, and leadership. Hybrid work, 3-4 days onsite. Pay rate: unknown.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
June 6, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#SQL (Structured Query Language) #AWS (Amazon Web Services) #Libraries #Cloud #GitHub #Programming #Data Warehouse #Scala #PySpark #Python #ETL (Extract, Transform, Load) #Apache Airflow #Data Engineering #dbt (data build tool) #Tableau #S3 (Amazon Simple Storage Service) #Docker #Visualization #Microsoft Power BI #Data Architecture #Airflow #Leadership #Data Modeling #Looker #Snowflake #Spark (Apache Spark) #Storage #BI (Business Intelligence) #DevOps #Data Pipeline
Role description
Location: Cincinnati, OH 45213
Type: Contract
Title: Onsite DBT Data Architect, Technical Lead, Senior Data Engineer

Description: Onsite DBT Data Architect, Technical Lead, Senior Data Engineer with 10+ years of experience in dbt, Snowflake, design, architecture, and data engineering for a contract, possibly contract-to-hire, position. This position is hybrid, onsite 3-4 days per week in the greater Cincinnati area.

Must have expertise in:
• dbt
• Snowflake
• Architecture and design
• Leadership and mentoring

Job description:
• Working on cloud data warehouses and data pipelines
• Working heavily with dbt to transform data; building, managing, testing, and maintaining data pipelines
• Designing and building data pipelines using Snowflake
• Design and architecture work
• Data modeling and designing architecture diagrams
• Data engineering: building scalable, sustainable, secure data platforms powering intelligent applications
• SQL query and stored procedure development
• SQL performance optimization
• Python programming
• GitHub and DAG construction
• CI/CD principles, DevOps practices, and software testing
• Working with AWS storage technologies (S3)
• Working with orchestration tools: Apache Airflow, Astronomer
• Design and development of data visualizations and reports using Looker, Tableau, Power BI, or custom visualization libraries

What you will need:
• 10+ years in dbt, data architecture and design, data engineering, and cloud data warehousing
• Expert-level dbt
• Strong design and architecture skills
• Advanced/expert-level Snowflake
• Leadership and mentoring of onshore and offshore teams

Other skills:
• Cloud-based data engineering (AWS)
• Orchestration using Apache Airflow and Astronomer
• Python
• Fivetran
• PySpark
• Looker
• Advanced SQL

Good to have:
• Data visualization: Tableau, Power BI
• Docker