Data Engineer (Snowflake)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Snowflake) in Sunnyvale, CA, with a contract length of 6–12 months at $100/hour. Key skills include Snowflake architecture, CI/CD with GitHub/Docker/Kubernetes, and Python/Airflow for data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
800
🗓️ - Date discovered
August 27, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Sunnyvale, CA
🧠 - Skills detailed
#Python #SQL (Structured Query Language) #Docker #ETL (Extract, Transform, Load) #Scala #Fivetran #Data Engineering #Airflow #Data Pipeline #Programming #Data Warehouse #Monitoring #Data Modeling #Data Ingestion #Deployment #Snowflake #GitHub #Kubernetes
Role description
Job Title: Senior Data Engineer
Location: Sunnyvale, CA (Hybrid); remote considered for exceptional candidates
Pay Rate: $100/hour
Contract Length: 6–12 Months + Extensions

Top 3 Skills
• Snowflake Expertise (Architecture + Administration): The core of this role is managing and orchestrating the Snowflake Data Warehouse, including setting up RBAC (role-based access control), warehouses, resource monitors, and performance tuning.
• CI/CD and Containerization (GitHub, Docker, Kubernetes): A major component of this role is building and maintaining CI/CD pipelines that deploy data pipelines into dev and production environments using GitHub, with orchestration in Kubernetes and Docker.
• Python + Airflow for Data Pipeline Development: The engineer will write ingestion scripts and orchestration flows using Airflow and build validation frameworks, all in Python.

Unleash the Power of Data – Join a High-Impact Engineering Team
Are you ready to work at the intersection of data, innovation, and next-gen infrastructure? We’re seeking a Data Engineer who thrives on building elegant data pipelines, managing scalable warehouse systems, and implementing cutting-edge CI/CD workflows. This is your chance to play a pivotal role in shaping a modern data stack with Snowflake, Airflow, and containerized deployment strategies. If you’re passionate about building reliable systems, solving complex problems, and love working with forward-thinking engineers, this is the role for you.

What You’ll Do
• Orchestrate and maintain data pipelines and workflows within a Snowflake Data Warehouse
• Ensure existing pipelines meet SLAs and operate smoothly with CI/CD best practices
• Configure and manage RBAC roles, users, resource monitors, and warehouse performance
• Develop and deploy CI/CD pipelines using GitHub, Kubernetes, and Docker
• Write and maintain data ingestion scripts with orchestration via Airflow
• Build robust data validation frameworks to ensure consistency and accuracy across layers

What You Bring
• Hands-on Snowflake experience, including role-based access controls, warehouses, and monitoring
• Strong programming skills in Python and advanced SQL proficiency
• Experience building CI/CD pipelines and deploying into dev/prod using GitHub, Docker, and Kubernetes
• Solid understanding of ELT/ETL processes and data modeling best practices
• Familiarity with setting up UDFs and stored procedures in Snowflake
• Experience with ingestion tools such as Fivetran, HVR, or Airbyte
• Excellent problem-solving and analytical skills
• Self-starter attitude with the ability to work independently or as part of a cross-functional team
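
To give candidates a concrete sense of the Python + Airflow work described above, here is a minimal, illustrative sketch of a daily ingestion DAG with a validation gate. All names (DAG id, bucket path, table names) are hypothetical and are not taken from the client’s environment.

```python
# Minimal sketch, assuming Airflow 2.x with the TaskFlow API.
# DAG id, S3 path, and table names below are hypothetical.
import pendulum
from airflow.decorators import dag, task

@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    catchup=False,
    tags=["snowflake", "ingestion"],
)
def orders_ingestion():
    @task
    def extract() -> str:
        # Pull raw data from the source system and stage it; return its location.
        return "s3://example-bucket/orders/latest.parquet"  # hypothetical path

    @task
    def load(uri: str) -> int:
        # Load the staged file into Snowflake (e.g. COPY INTO) and return the row count.
        # A real implementation would use snowflake-connector-python or a provider hook.
        print(f"COPY INTO RAW.ORDERS FROM '{uri}'")
        return 1_000  # placeholder count

    @task
    def validate(row_count: int) -> None:
        # Simple validation gate: fail the run if nothing was loaded.
        if row_count == 0:
            raise ValueError("No rows loaded into RAW.ORDERS")

    validate(load(extract()))

orders_ingestion()
```

In practice a DAG like this would be packaged in a Docker image and deployed through the GitHub-based CI/CD flow the role calls for.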
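
The Snowflake administration duties (warehouses, resource monitors, RBAC) largely come down to running account-level DDL. Below is a hedged sketch using snowflake-connector-python; the object names, the ACCOUNTADMIN role, and the specific grants are assumptions for illustration only.

```python
# Minimal sketch: provisioning a warehouse, a resource monitor, and a read-only
# role in Snowflake. Object names and credentials are hypothetical.
import os
import snowflake.connector

STATEMENTS = [
    "CREATE WAREHOUSE IF NOT EXISTS ANALYTICS_WH "
    "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
    "CREATE RESOURCE MONITOR IF NOT EXISTS ANALYTICS_MONITOR "
    "WITH CREDIT_QUOTA = 100 TRIGGERS ON 90 PERCENT DO SUSPEND",
    "ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = ANALYTICS_MONITOR",
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="ACCOUNTADMIN",  # resource monitors require elevated privileges
)
try:
    with conn.cursor() as cur:
        for stmt in STATEMENTS:
            cur.execute(stmt)
finally:
    conn.close()
```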
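
For the data validation frameworks mentioned in the responsibilities, a common starting point is a small set of SQL checks that each return a count of offending rows. The sketch below assumes hypothetical RAW and ANALYTICS schemas and is not the client’s actual framework.

```python
# Minimal sketch of a validation runner: each check returns a count of bad rows,
# and any non-zero result fails the run. Table and schema names are hypothetical.
import os
import snowflake.connector

CHECKS = [
    ("orders row counts match",
     "SELECT ABS((SELECT COUNT(*) FROM RAW.ORDERS) - "
     "(SELECT COUNT(*) FROM ANALYTICS.FCT_ORDERS))"),
    ("no null order ids",
     "SELECT COUNT(*) FROM ANALYTICS.FCT_ORDERS WHERE ORDER_ID IS NULL"),
]

def run_checks(conn) -> list[str]:
    failures = []
    with conn.cursor() as cur:
        for name, sql in CHECKS:
            cur.execute(sql)
            (bad_rows,) = cur.fetchone()
            if bad_rows:
                failures.append(f"{name}: {bad_rows} offending rows")
    return failures

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        problems = run_checks(conn)
    finally:
        conn.close()
    if problems:
        raise SystemExit("Validation failed:\n" + "\n".join(problems))
    print("All validation checks passed.")
```

A runner like this is typically wired into the Airflow DAG or the CI/CD pipeline so that consistency across layers is enforced on every deployment.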