Artmac

Senior Snowflake Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Snowflake Data Engineer with 8-15 years of experience, on-site in San Jose, California. Key skills include Snowflake architecture, data warehousing, SQL, and data integration tools. A Bachelor's degree is required. The contract is offered on a W2 or C2C basis.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 4, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
San Jose, CA
🧠 - Skills detailed
#Visualization #Programming #Compliance #Python #AWS (Amazon Web Services) #Snowflake #Tableau #Agile #Data Pipeline #Data Engineering #BI (Business Intelligence) #Java #Data Processing #Consulting #SQL (Structured Query Language) #Data Integration #Data Governance #GCP (Google Cloud Platform) #ML (Machine Learning) #ETL (Extract, Transform, Load) #Security #Data Integrity #Azure #Data Storage #Microsoft Power BI #Scala #Project Management #Data Modeling #Cloud #Storage #Data Science #Data Architecture
Role description
Who We Are
Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to its customers.
Job Description
Job Title: Senior Snowflake Data Engineer
Job Type: W2/C2C
Experience: 8-15 Years
Location: San Jose, California (On-Site)
Requirements
• 7-10 years of experience in data engineering.
• Proven experience with Snowflake architecture and data warehousing solutions.
• Experience with data visualization tools (e.g., Tableau, Power BI).
• Experience with Agile methodologies and project management tools.
• Strong analytical and problem-solving skills.
• Strong expertise in Snowflake architecture and data warehousing concepts.
• Experience with cloud platforms (AWS, Azure, or GCP) and data integration tools.
• Familiarity with programming languages such as Python or Java for data processing.
• Knowledge of machine learning concepts and frameworks.
• Proficiency in SQL and experience with data modeling and ETL tools.
• Solid understanding of data governance, security, and compliance best practices.
• Ability to work collaboratively in a team environment and communicate effectively with technical and non-technical stakeholders.
Responsibilities
• Design, develop, and maintain scalable data pipelines and ETL processes using Snowflake.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
• Implement best practices for data governance, security, and compliance within the Snowflake environment.
• Optimize data storage and query performance in Snowflake to ensure efficient data retrieval and processing.
• Monitor and troubleshoot data pipeline performance, ensuring data integrity and availability.
• Document data architecture, processes, and workflows to facilitate knowledge sharing and onboarding.
• Stay updated with the latest trends and advancements in data engineering and Snowflake technologies.
Qualifications
• Bachelor's degree or equivalent combination of education and experience.