

TestingXperts
Lead Snowflake Data Engineer/Architect – Houston, TX (Local | F2F Interview Required)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Snowflake Data Engineer/Architect in Houston, TX, requiring 12+ years of experience. The contract is on-site, and a face-to-face interview is mandatory. Key skills include Snowflake, ETL/ELT, AWS, and data architecture.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 7, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Houston, TX
-
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Azure #Data Science #Data Security #AWS (Amazon Web Services) #BI (Business Intelligence) #Data Engineering #Data Architecture #ADF (Azure Data Factory) #Azure Data Factory #Scala #Data Analysis #dbt (data build tool) #Documentation #Data Quality #Data Pipeline #Informatica #Snowflake #Cloud #Airflow #Security #GCP (Google Cloud Platform) #Data Warehouse
Role description
Snowflake Data Engineer/Architect with 12+ years of overall experience.
Address: Houston, TX 77072 (local candidates only; a face-to-face client interview is mandatory).
Must have previously worked with an implementation partner.
Job responsibilities
Design, develop, and implement end-to-end Snowflake data warehouse solutions.
Build and maintain ETL/ELT data pipelines using tools such as dbt, Airflow, Informatica, or Azure Data Factory (see the orchestration sketch after this list).
Develop data models (conceptual, logical, and physical) to support business intelligence and analytics needs.
Optimize Snowflake performance, including query tuning, warehouse sizing, and cost management.
Implement data quality, validation, and transformation processes to ensure accuracy and consistency.
Manage data security, access control, and user permissions within Snowflake (see the access-control sketch after this list).
Collaborate with data analysts, data scientists, and business teams to deliver reliable and scalable data solutions.
Integrate Snowflake with cloud platforms and external data sources (AWS, Azure, GCP).
Maintain comprehensive documentation for data architecture, pipelines, and processes.
Stay current with Snowflake features and emerging data technologies, recommending improvements and best practices.
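To illustrate the pipeline work described above, here is a minimal orchestration sketch, assuming a recent Airflow 2.x deployment with the Snowflake provider installed, a connection named "snowflake_default", and a hypothetical dbt project at /opt/dbt/analytics. All object and path names are placeholders, not part of this posting.

```python
# Minimal ELT sketch: load raw data into Snowflake, then run dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_elt_refresh",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Copy files already landed in an external stage into a raw table.
    load_raw = SnowflakeOperator(
        task_id="load_raw_orders",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw.orders FROM @raw.orders_stage FILE_FORMAT = (TYPE = CSV);",
    )

    # Run dbt transformations on top of the raw layer.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target prod",
    )

    load_raw >> run_dbt
```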
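For the access-control and cost-management duties, the following is a sketch of Snowflake role-based access control plus warehouse sizing and auto-suspend settings, issued through the Snowflake Python connector. The account identifier, roles, database, schema, and warehouse names are hypothetical examples only.

```python
# Sketch: create a read-only role, grant it access to a marts schema,
# and keep warehouse spend in check with a small size and auto-suspend.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",      # placeholder account identifier
    user="ADMIN_USER",      # placeholder administrative user
    password="***",
    role="SECURITYADMIN",
)

statements = [
    "CREATE ROLE IF NOT EXISTS ANALYTICS_READER",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYTICS_READER",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYTICS_READER",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYTICS_READER",
    # Cost management: extra-small warehouse that suspends after 60s idle.
    "ALTER WAREHOUSE BI_WH SET WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```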






