

Rapid Eagle Inc
Snowflake Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Architect with an unknown contract length, offering a pay rate of $65.00 - $100.00 per hour. Key skills include Snowflake, SQL, ETL processes, and cloud services (AWS, Azure). In-person work is required.
Country
United States
Currency
$ USD
Day rate
800
Date
December 5, 2025
Duration
Unknown
Location
On-site
Contract
Unknown
Security
Unknown
Location detailed
San Jose, CA 95135
Skills detailed
#Java #SQL Queries #Snowflake #Unix #Hadoop #Scala #Oracle #Database Design #Data Lake #Metadata #Microsoft SQL #Big Data #Spark (Apache Spark) #SQL Server #Cloud #Azure #SQL (Structured Query Language) #Programming #Scripting #AWS (Amazon Web Services) #Security #Data Ingestion #Agile #Databases #Bash #Data Access #Microsoft SQL Server #Python #Data Framework #Data Modeling #ETL (Extract, Transform, Load) #NoSQL #Web Services #Informatica #Data Warehouse #Data Architecture #MS SQL (Microsoft SQL Server)
Role description
Job Summary
We are seeking a dynamic and innovative Snowflake Architect to lead the design, development, and implementation of scalable data solutions on the Snowflake Data Cloud. In this role, you will apply your expertise in cloud data architecture to build robust, efficient, and secure data platforms that help organizations unlock the full potential of their data assets. You will drive the adoption of best practices in data modeling, ETL processes, and big data analytics, ensuring seamless integration across diverse data ecosystems. Join us to shape the future of enterprise data architecture with cutting-edge technologies and a collaborative spirit!
Duties
Lead the end-to-end architecture design for Snowflake-based data warehouses, ensuring optimal performance, scalability, and security
Collaborate with cross-functional teams to gather requirements and translate business needs into technical solutions leveraging Snowflake's features
Develop and implement data models, schemas, and metadata strategies aligned with best practices in database design and data warehousing
Design and oversee ETL/ELT workflows using tools like Informatica, Spark, or custom scripts in Java, Python, or Bash (Unix shell) to ensure efficient data ingestion and transformation (a minimal Python load sketch appears after this list)
Integrate Snowflake with cloud platforms such as AWS and Azure (including Azure Data Lake) to enable hybrid and multi-cloud architectures
Optimize SQL queries and leverage NoSQL databases or linked-data approaches for flexible data access patterns
Implement security protocols and governance policies to protect sensitive information within the Snowflake environment
Support agile development methodologies by participating in sprint planning, daily stand-ups, and continuous improvement initiatives
Stay abreast of emerging big data technologies such as Hadoop, Spark, and analytics tools to enhance system capabilities
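To make the ETL/ELT duty above concrete, here is a minimal, illustrative load sketch in Python using the snowflake-connector-python package: it bulk-loads staged CSV files into a Snowflake table and then transforms them in-database. The connection settings, warehouse, stage, and table names (LOAD_WH, RAW_DB, SALES_STAGE, SALES_RAW, SALES_CLEAN) are hypothetical placeholders, not details of this engagement.

```python
# Minimal ELT sketch: copy staged CSV files into a Snowflake table, then
# transform inside Snowflake. All object names are hypothetical placeholders.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="RAW_DB",
    schema="SALES",
)

try:
    cur = conn.cursor()
    # Bulk-ingest files that were previously PUT into an internal stage.
    copy_result = cur.execute("""
        COPY INTO SALES_RAW
        FROM @SALES_STAGE
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
        ON_ERROR = 'ABORT_STATEMENT'
    """).fetchall()
    print(f"COPY INTO returned {len(copy_result)} per-file result row(s)")

    # The "T" in ELT: deduplicate the raw data into a curated table in-database.
    cur.execute("""
        CREATE OR REPLACE TABLE SALES_CLEAN AS
        SELECT DISTINCT * FROM SALES_RAW
    """)
finally:
    conn.close()
```

In practice the same pattern is usually wrapped in an orchestrator (Informatica, Airflow, or a scheduled task) rather than run as a standalone script; the sketch only shows the load-then-transform shape the duty refers to.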
Requirements
Proven experience designing and deploying large-scale data warehouses on Snowflake Data Cloud or similar platforms
Strong knowledge of cloud services including AWS (Amazon Web Services) and Azure (Microsoft's cloud platform)
Expertise in data modeling techniques for relational databases such as Microsoft SQL Server and Oracle, as well as Hadoop-based systems
Hands-on experience with ETL/ELT processes using Informatica or custom scripting languages such as Java, Python, or Bash (Unix shell)
Familiarity with NoSQL databases, linked data concepts, and big data frameworks like Spark or Hadoop for advanced analytics applications
Proficiency in SQL programming along with knowledge of database design principles for scalable systems
Experience working within Agile development environments to deliver iterative value rapidly
Strong understanding of security best practices for cloud-based data environments, including access controls and encryption strategies (an illustrative grant-and-masking sketch appears after this list)
Excellent problem-solving skills combined with a proactive approach to learning new technologies in the evolving landscape of big data analytics
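For the access-control requirement above, the following is a hedged sketch of role-based access control and dynamic data masking in Snowflake, driven from Python. The role, database, schema, table, and column names (ANALYST_ROLE, ANALYTICS_DB.REPORTING, CUSTOMERS.EMAIL, PII_ADMIN) are illustrative assumptions; masking policies also require Snowflake Enterprise Edition and a role holding the CREATE MASKING POLICY and APPLY MASKING POLICY privileges.

```python
# Hypothetical role-based access control and masking-policy setup for a
# Snowflake environment. Object names below are illustrative placeholders.
import snowflake.connector

GOVERNANCE_STATEMENTS = [
    # Least-privilege, read-only role for analysts.
    "CREATE ROLE IF NOT EXISTS ANALYST_ROLE",
    "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ANALYST_ROLE",
    "GRANT USAGE ON SCHEMA ANALYTICS_DB.REPORTING TO ROLE ANALYST_ROLE",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.REPORTING TO ROLE ANALYST_ROLE",
    # Dynamic data masking: only a privileged role sees the raw email column.
    """
    CREATE MASKING POLICY IF NOT EXISTS ANALYTICS_DB.REPORTING.EMAIL_MASK
      AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
    """,
    """
    ALTER TABLE ANALYTICS_DB.REPORTING.CUSTOMERS
      MODIFY COLUMN EMAIL SET MASKING POLICY ANALYTICS_DB.REPORTING.EMAIL_MASK
    """,
]


def apply_governance(conn):
    """Run each governance statement under a role allowed to manage grants
    and policies (SECURITYADMIN here; adjust to your privilege model)."""
    cur = conn.cursor()
    try:
        cur.execute("USE ROLE SECURITYADMIN")
        for stmt in GOVERNANCE_STATEMENTS:
            cur.execute(stmt)
    finally:
        cur.close()
```

Encryption at rest and in transit is handled by Snowflake itself; the sketch only covers the access-control and masking side of the requirement.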
Join us as a Snowflake Architect, where your expertise will shape innovative solutions that transform raw data into actionable insights. Your drive will inspire teams to push boundaries, while your detailed knowledge ensures every project is executed flawlessly. Together, we'll unlock new possibilities through powerful cloud-based data architectures!
Pay: $65.00 - $100.00 per hour
Expected hours: 40 per week
Benefits:
Dental insurance
Health insurance
Paid time off
Work Location: In person






