Vedan Technologies

Snowflake Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Data Architect in Houston, TX (Hybrid) on a contract basis, offering competitive pay. Key skills include Snowflake architecture, SQL, ELT pipelines with Matillion and Python, and experience with AWS, Azure, or GCP.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#Data Storage #SQL (Structured Query Language) #Data Pipeline #Snowflake #Python #Documentation #AWS (Amazon Web Services) #Data Engineering #Complex Queries #Data Accuracy #ETL (Extract, Transform, Load) #Security #SQL Queries #Data Quality #Data Transformations #dbt (data build tool) #Cloud #GCP (Google Cloud Platform) #Data Modeling #Matillion #Storage #Azure #Scala #Data Extraction #Datasets #Data Architecture
Role description
Job Title: Snowflake Data Architect
Location: Houston, TX (Hybrid)
Contract Role

Job Summary
We are seeking an experienced Snowflake Data Architect to design, build, and optimize modern cloud-based data solutions. The ideal candidate will have deep expertise in Snowflake architecture, data modeling, and ELT pipelines, and will work closely with stakeholders to deliver scalable, reliable, and well-documented data platforms for analytics and reporting.

Key Responsibilities
• Design, implement, and optimize data models in Snowflake to ensure efficient data storage, performance, and retrieval.
• Architect and maintain ETL/ELT data pipelines using Matillion and Python, ensuring data accuracy, reliability, and scalability.
• Develop and manage data transformations using dbt, creating curated, analytics-ready datasets.
• Write and optimize complex SQL queries for data extraction, transformation, and loading processes.
• Deploy, manage, and monitor cloud-based data infrastructure across AWS, Azure, or GCP environments.
• Ensure data quality, consistency, and governance across all data pipelines and platforms.
• Collaborate with business and technical stakeholders to gather requirements and translate them into scalable data solutions.
• Create and maintain technical documentation for data architectures, pipelines, and transformation logic.
• Apply best practices for performance tuning, cost optimization, and security within Snowflake.

Required Skills & Qualifications
• Strong hands-on experience with Snowflake data engineering and data modeling.
• Proficiency in SQL, including complex queries and performance tuning.
• Experience building ELT pipelines using Matillion and Python.
• Strong experience with dbt for data transformation and modeling.
• Hands-on experience deploying and managing data platforms on AWS, Azure, or GCP.
• Solid understanding of data warehousing concepts and analytics architectures.
• Experience working with large-scale, high-volume datasets.
• Strong collaboration, communication, and documentation skills.