

VeeAR Projects Inc.
Snowflake Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Architect with a contract length of "unknown" and a pay rate of "unknown." Required skills include Snowflake expertise, data engineering, ETL tools (Informatica preferred), SQL, and cloud platforms (AWS, Azure). A degree in computer science or engineering and 5+ years of experience are essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 18, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Oakland, CA
-
🧠 - Skills detailed
#dbt (data build tool) #Azure #Data Governance #Computer Science #Data Integrity #AWS (Amazon Web Services) #Data Quality #Big Data #Scripting #Storage #Security #Cloud #Database Management #Tableau #Data Orchestration #Informatica #Data Analysis #Version Control #Python #Talend #ETL (Extract, Transform, Load) #Data Science #Kafka (Apache Kafka) #SQL (Structured Query Language) #BI (Business Intelligence) #Data Management #Spark (Apache Spark) #Data Storage #GIT #Visualization #Data Engineering #Data Processing #Metadata #Snowflake #Hadoop #Data Pipeline #Microsoft Power BI
Role description
Job Description:
The Data Analytics and Insights team is seeking an experienced and talented Snowflake Architect to join our growing team of analytics experts. As a key member of our team, you will play an essential role in the design, development, and maintenance of data pipelines, data products, and analytic products in enterprise Snowflake. The ideal candidate will have a strong background in data engineering, with specific expertise in Informatica and Snowflake. This role involves working closely with our business stakeholders, data analysts, and data scientists to ensure the efficient development and management of our data infrastructure.
You will have a unique opportunity to be at the forefront of the utility industry and gain a comprehensive view of the nation's most advanced smart grid. It is the perfect role for someone who would like to continue to build upon their professional experience and help advance sustainability goals.
Key Responsibilities:
• Develop and optimize cloud-based data storage and processing solutions using Snowflake.
• Design, implement, and maintain robust data pipelines and ETL processes.
• Collaborate with Federated teams and other data engineers to understand data requirements and deliver high-quality data solutions.
• Ensure data integrity and security across all data workflows and storage solutions.
• Monitor and troubleshoot data pipelines, addressing any issues promptly to ensure the smooth flow of data.
• Develop reusable and modular stored procedures and scripts for data processing.
• Contribute to the development and implementation of best practices for data governance, data quality, and metadata management.
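As a hypothetical illustration of the kind of reusable, modular data-processing scripts with built-in quality checks described in the responsibilities above (the function names and record shape here are invented for the sketch, not part of the role):

```python
def clean_rows(rows):
    """Data-quality gate: drop records missing required fields or a value."""
    required = {"id", "value"}
    return [r for r in rows if required <= r.keys() and r["value"] is not None]

def normalize_values(rows):
    """Modular transform step: cast the `value` field to float."""
    return [{**r, "value": float(r["value"])} for r in rows]

def run_pipeline(rows, steps):
    """Chain independent steps so each can be tested and monitored in isolation."""
    for step in steps:
        rows = step(rows)
    return rows

raw = [{"id": 1, "value": "3.5"}, {"id": 2, "value": None}, {"value": "7"}]
result = run_pipeline(raw, [clean_rows, normalize_values])
# The record with a null value and the record missing `id` are filtered out.
```

Composing small steps this way is one common pattern for the "reusable and modular" requirement: each step can be unit-tested, logged, and swapped without touching the rest of the pipeline.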
Minimum Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Minimum of 5 years of experience in data engineering or a related role.
• Proven experience with Snowflake is required.
• Knowledge of data warehousing concepts, dimensional modeling, and performance tuning.
• Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, or custom ETL frameworks).
• Strong proficiency in SQL and database management.
• Experience with cloud platforms such as AWS, Azure, or Google Cloud.
• Familiarity with version control (Git) and CI/CD for data pipelines.
• Familiarity with Big Data technologies such as Hadoop, Spark, and Kafka is a plus.
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration abilities.
Preferred Skills:
• Experience with Python or other scripting languages.
• Experience with Informatica is a plus, but not required.
• Knowledge of data visualization tools such as Tableau or Power BI.
• Exposure to data orchestration tools.
