

Snowflake Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Architect on a long-term remote contract, offering competitive pay. Candidates should have 8+ years in Data Engineering/Architecture, expertise in Snowflake, and experience with ETL/ELT pipelines and data migration. Snowflake Architect Certification preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 29, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Delta Lake #Data Pipeline #Strategy #Data Quality #Migration #Clustering #Data Mart #Automation #Teradata #Matillion #Data Lake #BI (Business Intelligence) #dbt (data build tool) #Airflow #Python #Storage #Data Governance #Data Architecture #Data Migration #DataOps #Redshift #Scala #SnowPipe #Data Profiling #Security #Databricks #Oracle #Microsoft Power BI #Talend #ETL (Extract, Transform, Load) #Data Engineering #Cloud #BigQuery #SQL Server #Data Strategy #Batch #Documentation #Looker #Data Lakehouse #Data Warehouse #Data Integrity #Tableau #Snowflake #SQL (Structured Query Language)
Role description
Role: Snowflake Architect
Location: Remote
Duration: Long-term contract
Experience Required: 8+ years in Data Engineering / Architecture
Job Summary:
We are seeking an experienced Snowflake Architect to lead the design, development, and optimization of data architecture solutions using Snowflake. The ideal candidate will have deep expertise in Snowflake Architecture and experience re-architecting legacy systems for cloud data platforms. You will also be responsible for building scalable ETL/ELT pipelines and ensuring seamless data migration from various sources to Snowflake.
Key Responsibilities:
1. Snowflake Architecture & Design
• Design and implement Snowflake architecture for scalable and secure data solutions.
• Define and maintain the overall Snowflake platform strategy, including best practices for development, optimization, security, and cost management.
• Lead Snowflake performance tuning, clustering, partitioning, and workload management, as sketched below.
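For illustration only, a minimal Snowflake SQL sketch of the clustering and workload-management work above (the table, column, and warehouse names are hypothetical):

    -- Add a clustering key so large scans can prune micro-partitions effectively
    ALTER TABLE sales.fct_orders CLUSTER BY (order_date, region);

    -- Inspect how well the table is clustered on those columns
    SELECT SYSTEM$CLUSTERING_INFORMATION('sales.fct_orders', '(order_date, region)');

    -- Dedicated BI warehouse with auto-suspend and multi-cluster scaling to balance cost and concurrency
    CREATE WAREHOUSE IF NOT EXISTS bi_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      AUTO_SUSPEND = 60        -- suspend after 60 seconds of inactivity
      AUTO_RESUME = TRUE
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3;   -- multi-cluster scaling requires Enterprise edition or higher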
2. Architectural Frameworks
• Develop and apply data architecture frameworks and patterns to guide solution design and delivery.
• Align architectural decisions with enterprise data strategy and business goals.
• Create documentation for architectural decisions, technical standards, and data governance.
3. Snowflake Design & Modelling
• Design data models (Dimensional, Star Schema, Snowflake Schema) tailored to business requirements.
• Implement role-based access control (RBAC) and data masking policies in Snowflake (see the sketch after this list).
• Optimize storage, compute, and query performance using caching, materialized views, and clustering.
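A minimal sketch of the RBAC and column-masking work above, in Snowflake SQL (the role, schema, table, and column names are hypothetical):

    -- Read-only role scoped to the analytics marts schema
    CREATE ROLE IF NOT EXISTS analyst_ro;
    GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
    GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;

    -- Column-level masking: only privileged roles see raw email addresses
    CREATE MASKING POLICY IF NOT EXISTS mask_email AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('SYSADMIN', 'PII_ADMIN') THEN val
           ELSE '*** MASKED ***' END;

    ALTER TABLE analytics.marts.dim_customer
      MODIFY COLUMN email SET MASKING POLICY mask_email;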
4. Data Migration
• Lead end-to-end migration efforts from traditional data warehouses (e.g., Teradata, Oracle, SQL Server) or cloud platforms (e.g., Redshift, BigQuery) to Snowflake.
• Plan and execute data validation, data profiling, and transformation logic as part of migration.
• Automate data load processes and data quality checks post-migration (see the sketch after this list).
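As one example of the load-and-validate step, a Snowflake SQL sketch (the stage, file format options, and table names are hypothetical; the reconciliation figures would be compared against totals from the legacy source):

    -- Bulk-load legacy extracts (e.g., Teradata or Oracle exports landed in cloud storage)
    COPY INTO staging.orders_raw
      FROM @staging.legacy_extract_stage/orders/
      FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
      ON_ERROR = 'ABORT_STATEMENT';

    -- Basic reconciliation against the source system: row counts and control totals
    SELECT COUNT(*)                 AS row_count,
           COUNT(DISTINCT order_id) AS distinct_orders,
           SUM(order_amount)        AS total_amount
    FROM staging.orders_raw;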
5. Medallion / Delta Lake Pattern Implementation (a plus)
• Architect and implement Medallion architecture (Bronze, Silver, Gold layers) for data lakehouses using Snowflake or Delta Lake concepts (a sketch follows this list).
• Ensure data integrity, quality, and lineage across ingestion, transformation, and presentation layers.
• Design streaming and batch ingestion pipelines that follow DataOps and Lakehouse principles.
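One possible Snowflake-native shape for the Medallion layers above, sketched with streams and tasks (all schema, table, and warehouse names are hypothetical):

    -- Bronze / Silver / Gold layers as separate schemas
    CREATE SCHEMA IF NOT EXISTS lakehouse.bronze;  -- raw, as-ingested data
    CREATE SCHEMA IF NOT EXISTS lakehouse.silver;  -- cleaned and conformed data
    CREATE SCHEMA IF NOT EXISTS lakehouse.gold;    -- business-ready marts

    -- Capture new Bronze rows and promote them to Silver on a schedule
    CREATE STREAM IF NOT EXISTS lakehouse.silver.orders_stream
      ON TABLE lakehouse.bronze.orders_raw;

    CREATE TASK IF NOT EXISTS lakehouse.silver.promote_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('lakehouse.silver.orders_stream')
    AS
      INSERT INTO lakehouse.silver.orders
      SELECT order_id,
             TRY_TO_DATE(order_date) AS order_date,
             TRIM(customer_id)       AS customer_id
      FROM lakehouse.silver.orders_stream
      WHERE METADATA$ACTION = 'INSERT';
    -- Tasks are created suspended; ALTER TASK ... RESUME activates the schedule.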
6. Snowflake Re-Architecture
• Evaluate current Snowflake implementations and propose re-architecture strategies to address scalability, cost, and performance bottlenecks (see the sketch after this list).
• Modernize legacy data pipelines, consolidate data marts, and refactor data models.
• Lead re-platforming efforts with a focus on automation, modularity, and reusability.
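The evaluation that precedes a re-architecture can start from the standard SNOWFLAKE.ACCOUNT_USAGE views; a sketch (the 30-day window and the LIMIT are arbitrary choices):

    -- Credit consumption by warehouse over the last 30 days: where is the spend going?
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits DESC;

    -- Longest-running queries: candidates for remodeling, clustering, or pipeline refactoring
    SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS elapsed_seconds, query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20;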
7. ETL / ELT Data Pipelines
• Design, build, and maintain ETL/ELT pipelines using tools like dbt, Talend, Matillion, Airflow, or native Snowflake features (Streams, Tasks, Snowpipe); a Snowpipe sketch follows this list.
• Integrate data from multiple sources including APIs, flat files, and streaming platforms.
• Monitor, troubleshoot, and optimize pipeline performance and data latency.
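A minimal Snowpipe sketch of the native-Snowflake ingestion path mentioned above (the stage URL, storage integration, and table names are placeholders; the cloud-side event notification setup is omitted):

    -- External stage over the cloud-storage landing zone
    CREATE STAGE IF NOT EXISTS raw.events_stage
      URL = 's3://example-bucket/events/'
      STORAGE_INTEGRATION = s3_int;

    -- Continuous micro-batch ingestion: new files are loaded as they arrive
    CREATE PIPE IF NOT EXISTS raw.events_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw.events_json
      FROM @raw.events_stage
      FILE_FORMAT = (TYPE = 'JSON');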
Preferred Qualifications:
• Snowflake Architect Certification.
• Experience with Delta Lake or the Databricks ecosystem.
• Knowledge of Python or Scala for data engineering tasks.
• Familiarity with BI tools like Power BI, Tableau, or Looker.