Trebecon LLC

Snowflake Architect :: Hybrid, TX Locals Only :: W2 Position

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Architect with 8+ years in data engineering, focusing on Snowflake Data Cloud. It is a hybrid position in Houston, TX, offering a W2 contract. Key skills include SQL, ETL/ELT tools, and cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Datasets #Scala #Security #ETL (Extract, Transform, Load) #Data Modeling #Matillion #Data Security #AWS (Amazon Web Services) #Data Architecture #BI (Business Intelligence) #SQL (Structured Query Language) #GCP (Google Cloud Platform) #dbt (data build tool) #Informatica #Data Warehouse #Data Migration #Python #Storage #Data Pipeline #Data Storage #Leadership #Visualization #Cloud #Snowflake #Talend #Data Governance #Azure #Migration #Data Engineering
Role description
Snowflake Architect
Hybrid | Houston, TX

Key Responsibilities
• Design and implement end-to-end data architecture using Snowflake.
• Develop and optimize data models, data pipelines, and data warehouses.
• Architect scalable ETL/ELT solutions for structured and semi-structured data.
• Collaborate with data engineers, analysts, and business stakeholders to deliver data solutions.
• Implement best practices for data governance, security, and performance optimization in Snowflake.
• Integrate Snowflake with cloud platforms such as AWS, Azure, or Google Cloud Platform.
• Provide technical leadership and mentor data engineering teams.
• Optimize query performance, data storage, and cost management.
• Support data migration projects from legacy systems to Snowflake.
• Work with BI tools to enable data visualization and analytics.

Required Skills & Qualifications
• 8+ years of experience in data engineering or data architecture.
• Strong hands-on experience with Snowflake Data Cloud.
• Expertise in SQL, data modeling, and data warehousing concepts.
• Experience with ETL/ELT tools such as Informatica, Talend, Matillion, or dbt.
• Knowledge of cloud platforms (AWS, Azure, or Google Cloud Platform).
• Experience building data pipelines using Python or similar languages.
• Understanding of data security, governance, and performance tuning.
• Experience working with large datasets and distributed systems.