CloudHive

Snowflake Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Data Engineer; the contract length and pay rate are not specified. It requires 4+ years of Snowflake data engineering experience and proficiency in SQL; a Snowflake certification is preferred. The position is remote.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
πŸ—“οΈ - Date
December 20, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#Deployment #ETL (Extract, Transform, Load) #Snowflake #Data Modeling #Version Control #Data Engineering #SQL (Structured Query Language) #BI (Business Intelligence) #Data Pipeline #Data Quality #Documentation #GIT #AI (Artificial Intelligence) #Security #Data Warehouse #Cloud #Microsoft Power BI #DevOps
Role description
CloudHive is your North America and LATAM Snowflake-focused staffing partner. We help accelerate Snowflake adoption and maximize the value of the Snowflake AI Data Cloud by delivering experienced, on-demand Snowflake contractors and specialized talent acquisition services.

You'll lead end-to-end data engineering to design, build, and optimize a multi-layer (Bronze/Silver/Gold) data warehouse for analytics and reporting. This role involves gathering requirements, creating production-ready schemas and pipelines, and delivering validated, secure, analytics-ready data that empowers the client with robust Snowflake solutions.

Key Responsibilities:
• Gather business requirements and design data models for a multi-layer Snowflake data warehouse.
• Build ingestion and ELT pipelines using Snowflake features (e.g., warehouses, Streams & Tasks, stages) and SQL for transformations and data quality checks (see the illustrative sketch at the end of this listing).
• Implement data quality validations, performance tuning, and security measures to ensure reliable, analytics-ready data.
• Integrate with tools like DevOps Repos for Git-based version control and Power BI for reporting.
• Support production deployments, troubleshoot issues, and provide documentation for seamless handover.
• Collaborate with stakeholders to optimize pipelines and align with analytics goals.

Requirements:
• 4+ years of hands-on experience in Snowflake data engineering, including multi-layer warehouse design and ELT processes.
• Proficiency in Snowflake (warehouses, Streams & Tasks, stages), advanced SQL, Git-based version control (e.g., DevOps Repos), and Power BI integration.
• Strong skills in data modeling, ingestion, transformations, performance optimization, and data quality implementation.
• Experience with production deployments and secure, validated data pipelines.
• Snowflake certification (e.g., SnowPro Core) preferred.
• Excellent problem-solving and communication skills for cross-functional collaboration.

Please get in contact for more information. Strictly no third parties.
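For context on the pipeline work described above, here is a minimal Snowflake SQL sketch of a Streams & Tasks pattern that promotes new rows from a Bronze staging table into a validated Silver table. All object names (bronze.orders, silver.orders, transform_wh, the stream and task names) and the schedule are illustrative assumptions, not part of the role description; an actual engagement would follow the client's naming, scheduling, and data quality standards.

-- Capture new rows landing in the (hypothetical) Bronze table.
CREATE STREAM IF NOT EXISTS bronze_orders_stream ON TABLE bronze.orders;

-- Scheduled task that promotes captured rows to Silver with basic quality checks.
CREATE TASK IF NOT EXISTS load_silver_orders
  WAREHOUSE = transform_wh                             -- assumed warehouse name
  SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('bronze_orders_stream')  -- skip runs with no new data
AS
  INSERT INTO silver.orders (order_id, customer_id, order_ts, amount)
  SELECT order_id,
         customer_id,
         TRY_TO_TIMESTAMP(order_ts_raw)   AS order_ts, -- unparseable values become NULL
         TRY_TO_NUMBER(amount_raw, 12, 2) AS amount
  FROM bronze_orders_stream
  WHERE order_id IS NOT NULL;                          -- simple data quality gate

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_silver_orders RESUME;

A Gold reporting layer would then typically be exposed as views or further tasks over the Silver tables, with Power BI connecting to that layer.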