Snowflake Data Architect (with SQL & DBT Experience)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Architect with SQL and DBT experience; the contract length and pay rate are unspecified. It requires 8+ years in data engineering and strong Snowflake and SQL skills, with financial services experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 18, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Data Engineering #dbt (data build tool) #Leadership #Visualization #Data Modeling #Scala #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Tableau #DevOps #Automation #Microsoft Power BI #Compliance #BI (Business Intelligence) #AWS (Amazon Web Services) #Version Control #SQL (Structured Query Language) #Data Governance #Azure #Snowflake #Security #Cloud #Data Architecture #Data Science #Looker #Data Pipeline #Airflow #SQL Queries
Role description
Key Responsibilities
• Architect, design, and implement scalable data solutions leveraging Snowflake and DBT.
• Develop and optimize SQL queries, data models, and transformation pipelines.
• Collaborate with business stakeholders, analysts, and data scientists to translate requirements into efficient data solutions.
• Implement best practices for data governance, quality, security, and compliance.
• Monitor, troubleshoot, and enhance data pipelines for performance and reliability.
• Drive automation and CI/CD practices within the data engineering ecosystem.
• Provide technical leadership and mentorship to junior engineers where needed.

Required Skills & Experience
• 8+ years of experience in data engineering or data architecture roles.
• Strong hands-on expertise with Snowflake (data modeling, performance optimization, security, and integration).
• Proficiency in DBT for transformation and workflow management.
• Advanced knowledge of SQL (query optimization, complex joins, window functions, CTEs).
• Experience with data pipeline orchestration and integration tools (e.g., Airflow, Prefect, or similar).
• Solid understanding of ETL/ELT design patterns, data warehousing concepts, and modern data architectures.
• Familiarity with cloud platforms (AWS, Azure, or Google Cloud Platform).

Nice to Have
• Previous experience in the financial services domain (banking, insurance, capital markets, fintech).
• Exposure to data visualization or BI tools (Tableau, Power BI, Looker).
• Experience with version control, CI/CD, and DevOps practices for data.
• Strong problem-solving, analytical, and communication skills.
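For candidates gauging the SQL bar here, a minimal sketch of the "window functions and CTEs" skill named in the requirements, run against an in-memory SQLite database (Snowflake syntax for these constructs is essentially the same). The table and column names (trades, account_id, amount) are hypothetical, purely for illustration.

```python
import sqlite3

# Toy dataset standing in for a financial-services fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (account_id TEXT, trade_date TEXT, amount REAL);
INSERT INTO trades VALUES
  ('A', '2025-01-01', 100.0),
  ('A', '2025-01-02',  50.0),
  ('B', '2025-01-01', 200.0);
""")

rows = conn.execute("""
WITH daily AS (                        -- CTE: per-account daily totals
  SELECT account_id, trade_date, SUM(amount) AS total
  FROM trades
  GROUP BY account_id, trade_date
)
SELECT account_id,
       trade_date,
       SUM(total) OVER (               -- window function: running balance
         PARTITION BY account_id
         ORDER BY trade_date
       ) AS running_total
FROM daily
ORDER BY account_id, trade_date;
""").fetchall()

print(rows)
# [('A', '2025-01-01', 100.0), ('A', '2025-01-02', 150.0), ('B', '2025-01-01', 200.0)]
```

In a DBT project, the CTE-plus-select shape above is exactly how a model file is typically structured, with upstream models referenced via `ref()` instead of raw table names.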