Snowflake Data Engineer (with Terraform), 12+ Years of Experience

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Data Engineer with Terraform expertise, requiring 12+ years of experience. It is a hybrid position in Washington, DC and NYC for 6+ months, focusing on data architecture, CI/CD integration, and cloud security management.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 20, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Washington DC-Baltimore Area
🧠 - Skills detailed
#Deployment #Cloud #Jenkins #Storage #GIT #Databases #S3 (Amazon Simple Storage Service) #Scala #Azure #GCP (Google Cloud Platform) #GitLab #AWS (Amazon Web Services) #Data Ingestion #Data Engineering #Migration #Terraform #Snowflake #SnowPipe #Vault #Security #GitHub #Data Pipeline
Role description
Snowflake Data Engineer (with Terraform)
Location: Washington, DC and NYC, hybrid (3-4 days a week)
Duration: 6+ months

Role summary
We are seeking a highly skilled Snowflake Data Engineer with strong Terraform expertise to design, build, and manage scalable data workloads on Snowflake. The ideal candidate will automate infrastructure deployment, orchestrate data pipelines, and ensure efficient, secure data operations in a cloud environment.

Key responsibilities
• Design and implement Snowflake architecture components using Terraform modules, including accounts, databases, schemas, virtual warehouses, roles, users, grants, stages, pipes, tasks, and streams (a provider sketch follows this description)
• Develop reusable, versioned Terraform modules and maintain a remote state backend with state locking (S3, Azure Storage, or GCS; see the backend sketch below)
• Integrate Terraform workflows into CI/CD pipelines (GitHub, GitLab CI, Jenkins, etc.) to enable automated plan/apply and PR-based change control
• Automate deployment of Snowflake TASK objects (scheduled and stream processing) and ensure safe migration strategies for production workloads (see the task sketch below)
• Implement security controls using least-privilege RBAC, object-level grants, and secrets management (HashiCorp Vault or cloud secret stores); see the grants sketch below
• Collaborate with data engineering teams to onboard pipelines (Snowpipe, ingestion stages, external tables) and ensure Terraform models match runtime needs (see the ingestion sketch below)
• Monitor, tune, and cost-optimize Snowflake compute usage and storage; implement resource monitors and alerting (see the resource monitor sketch below)

Suggested qualifications (optional)
• Strong hands-on experience with Snowflake platform internals and best practices
• Proven experience designing and implementing Terraform modules for Snowflake and cloud resources
• Familiarity with Git-based CI/CD workflows and automated infrastructure testing
• Experience with secrets management solutions (HashiCorp Vault, AWS Secrets Manager, Azure Key Vault, GCP Secret Manager)
• Good understanding of data ingestion patterns (Snowpipe, external tables, streaming) and production migration strategies
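To ground the first responsibility, here is a minimal sketch of managing core Snowflake objects with the community Snowflake-Labs/snowflake Terraform provider. The provider choice, version pin, and all object names and sizes are illustrative assumptions rather than details from the posting, and attribute schemas change between provider releases.

```hcl
terraform {
  required_providers {
    snowflake = {
      source  = "Snowflake-Labs/snowflake" # provider choice is an assumption
      version = "~> 0.94"                  # pin a tested version; schemas change often
    }
  }
}

# Illustrative database and warehouse; names and sizes are hypothetical.
resource "snowflake_database" "analytics" {
  name    = "ANALYTICS"
  comment = "Managed by Terraform"
}

resource "snowflake_warehouse" "etl" {
  name           = "ETL_WH"
  warehouse_size = "XSMALL"
  auto_suspend   = 60   # seconds idle before suspending, to control cost
  auto_resume    = true
}
```

In practice these resources would live in a reusable, versioned module, with names and sizes passed in as variables per environment.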
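For the remote state and locking responsibility, one common pattern is an S3 backend with a DynamoDB lock table (the Azure Storage and GCS backends offer equivalent locking); the bucket, key, and table names below are hypothetical.

```hcl
terraform {
  backend "s3" {
    bucket         = "example-org-tfstate"              # hypothetical state bucket
    key            = "snowflake/prod/terraform.tfstate" # one state file per environment
    region         = "us-east-1"
    dynamodb_table = "terraform-state-locks"            # hypothetical lock table
    encrypt        = true                               # encrypt state at rest
  }
}
```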
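For automating TASK deployment, a hedged sketch using the provider's snowflake_task resource; the task name, schedule, and called procedure are assumptions, and exact arguments (for example, enabled versus started) differ across provider versions.

```hcl
resource "snowflake_task" "refresh_events" {
  name          = "REFRESH_EVENTS"                         # hypothetical task name
  database      = snowflake_database.analytics.name
  schema        = "PUBLIC"
  warehouse     = snowflake_warehouse.etl.name
  schedule      = "USING CRON 0 * * * * UTC"               # hourly, in Snowflake's CRON syntax
  sql_statement = "CALL ANALYTICS.PUBLIC.REFRESH_EVENTS()" # hypothetical stored procedure
  enabled       = true # renamed to `started` in newer provider versions
}
```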
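For least-privilege RBAC, a sketch of object-level grants using the grant resources available in recent (roughly v0.86+) provider releases; the LOADER role and the single USAGE privilege are illustrative assumptions.

```hcl
resource "snowflake_account_role" "loader" {
  name = "LOADER" # hypothetical role for ingestion jobs
}

# Grant only USAGE on the database, following least privilege;
# broader privileges would be granted per schema or table as needed.
resource "snowflake_grant_privileges_to_account_role" "loader_db_usage" {
  account_role_name = snowflake_account_role.loader.name
  privileges        = ["USAGE"]
  on_account_object {
    object_type = "DATABASE"
    object_name = snowflake_database.analytics.name
  }
}
```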
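For pipeline onboarding, a minimal Snowpipe sketch pairing an external stage with an auto-ingest pipe; the storage integration, bucket URL, target table, and file format are assumptions, and the COPY statement is deliberately simplified.

```hcl
resource "snowflake_stage" "raw_events" {
  name                = "RAW_EVENTS_STAGE"
  database            = snowflake_database.analytics.name
  schema              = "PUBLIC"
  url                 = "s3://example-bucket/events/" # hypothetical bucket
  storage_integration = "S3_INT"                      # assumes an existing integration
}

resource "snowflake_pipe" "events" {
  name           = "EVENTS_PIPE"
  database       = snowflake_database.analytics.name
  schema         = "PUBLIC"
  auto_ingest    = true # triggered by cloud storage event notifications
  copy_statement = "COPY INTO ANALYTICS.PUBLIC.EVENTS FROM @ANALYTICS.PUBLIC.RAW_EVENTS_STAGE FILE_FORMAT = (TYPE = 'JSON')"
}
```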
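Finally, for cost control, a hedged resource monitor sketch; trigger-related attribute names have churned across provider versions, so treat every argument here as something to verify against the docs for the version you pin.

```hcl
resource "snowflake_resource_monitor" "monthly_cap" {
  name            = "MONTHLY_CREDIT_CAP" # hypothetical monitor name
  credit_quota    = 100                  # illustrative monthly credit budget
  frequency       = "MONTHLY"
  start_timestamp = "IMMEDIATELY"
  notify_triggers = [80]                 # notify at 80% of quota
  # Attaching the monitor to specific warehouses (or the whole account)
  # is version-dependent; check the pinned provider's documentation.
}
```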