Sr. Data Architect - SNOWFLAKE

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Architect specializing in Snowflake, requiring 10+ years of data engineering experience. It is a 3-month remote contract with a pay rate of 1.4 lakh per month, focusing on ETL processes and data governance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 26, 2025
🕒 - Project duration
3 to 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Sumner County, TN
-
🧠 - Skills detailed
#Security #Visualization #GCP (Google Cloud Platform) #dbt (data build tool) #Snowflake #AWS (Amazon Web Services) #Compliance #ETL (Extract, Transform, Load) #Schema Design #Data Integration #Data Migration #DataOps #Microsoft Power BI #BI (Business Intelligence) #Databases #Cloud #Data Lake #SQL (Structured Query Language) #Azure #Tableau #Python #Data Warehouse #Migration #SAP #Data Pipeline #Scala #Computer Science #Java #Data Governance #Data Architecture #Data Engineering #Data Modeling #Data Security
Role description
Job Role: Sr. Data Architect - SNOWFLAKE
Experience: 10+ years
Location: 100% Remote
Job Type: Contract
Contract Duration: 3 months
Budget: 1.4 lakh per month
Working Time: 12:00 PM - 09:00 PM IST

JOB DESCRIPTION
• A minimum of 10 years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms.
• Working hours: 8 hours per day (12 PM - 9 PM IST), with a few hours of overlap with the EST time zone. This overlap is mandatory, as meetings are held during these hours.

RESPONSIBILITIES
Mandatory Skills: Snowflake, data architecture, ETL processes, and large-scale data migration solutioning.
• Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.
• Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.
• Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.
• Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments.
• Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.
• Conduct performance tuning and optimization of Snowflake databases and queries.
• Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.

PRIMARY SKILLS:
• Extensive experience in designing and implementing data solutions using Snowflake and dbt.
• Proficiency in data modeling, schema design, and optimization within Snowflake environments.
• Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake.
• Expertise in Python/Java/Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake.
• Familiarity with other cloud platforms and data technologies (e.g., AWS, Azure, GCP).
• Demonstrated experience in implementing data governance frameworks and DataOps practices.
• Working experience in SAP environments.
• Familiarity with real-time streaming technologies and Change Data Capture (CDC) mechanisms.
• Knowledge of data governance principles and DataOps methodologies.
• Proven track record of architecting and delivering complex data solutions on cloud platforms and Snowflake.

SECONDARY SKILLS (If Any)
• Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
• Knowledge of data security and compliance standards.
• Excellent communication and presentation skills, with the ability to convey complex technical concepts to junior and non-technical stakeholders.
• Strong problem-solving and analytical skills; ability to work effectively in a collaborative team environment and lead cross-functional initiatives.

CERTIFICATIONS REQUIRED (If Any)
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced: Architect, SnowPro Advanced: Data Engineer) are desirable but not mandatory.