Cygnus Professionals Inc.

Snowflake Data Engineer - W2 Only, No C2C - 8+ Years

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Data Engineer with 8+ years of experience, focusing on Snowflake Data Warehouse design. The position is W2 only (no C2C), contract length is unspecified, and the listed day rate is $440. Required skills include SQL, Python, cloud platforms (AWS, Azure, or GCP), and ELT pipeline development.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
🗓️ - Date
December 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 only (no Corp-to-Corp)
-
🔒 - Security
Unknown
-
📍 - Location detailed
Indianapolis, IN
-
🧠 - Skills detailed
#dbt (data build tool) #R #Data Strategy #Matillion #ML (Machine Learning) #Strategy #Data Modeling #GCP (Google Cloud Platform) #Security #Cloud #Data Analysis #Automated Testing #Data Security #Fivetran #Clustering #Data Engineering #SAP #Python #ETL (Extract, Transform, Load) #SnowPipe #Data Warehouse #JSON (JavaScript Object Notation) #Airflow #Data Management #Azure #Data Science #Data Pipeline #Data Quality #Metadata #Snowflake #Compliance #IoT (Internet of Things) #BI (Business Intelligence) #Scala #Data Transformations #Version Control #Data Governance #AWS (Amazon Web Services) #AI (Artificial Intelligence) #GIT #CRM (Customer Relationship Management) #SQL (Structured Query Language) #Computer Science
Role description
We are seeking a Snowflake Data Engineer to design, build, and optimize scalable data solutions on the Snowflake Data Cloud. This role will support analytics, reporting, and AI/ML initiatives across commercial, manufacturing, R&D, and quality systems. The ideal candidate has strong expertise in cloud data engineering, ELT pipelines, and enterprise-grade data platforms within regulated environments.

Required Qualifications
• Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field
• 8+ years of hands-on data engineering experience with a strong focus on Snowflake Data Warehouse design and management
• Extensive experience designing, building, and managing enterprise-scale Snowflake data warehouses
• Strong hands-on experience with Snowflake (SQL, Virtual Warehouses, Snowpipe, Streams, Tasks, Time Travel, Zero-Copy Cloning); illustrative sketches of these features follow the responsibilities list below
• Proven expertise in Snowflake warehouse management, including sizing, multi-cluster warehouses, workload isolation, concurrency scaling, and cost optimization
• Proficiency in SQL and Python for data transformations and orchestration
• Experience with cloud platforms: AWS, Azure, or GCP
• Experience building robust ELT pipelines and working with structured and semi-structured data (JSON, Parquet, Avro)
• Strong knowledge of data modeling for data warehouses (star/snowflake schemas, dimensional modeling)
• Experience implementing data governance, security, and access controls in Snowflake (RBAC, masking policies, row access policies); see the governance sketch below
• Experience with Git-based version control and CI/CD pipelines

Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Snowflake as the core data platform
• Build and optimize ELT workflows using tools such as dbt, Airflow, Matillion, or Fivetran
• Implement data models (star/snowflake schemas) to support analytics, BI, and advanced analytics use cases
• Optimize Snowflake performance and cost (warehouses, clustering, caching, resource monitors)
• Integrate data from diverse sources: ERP (SAP), CRM (Salesforce), manufacturing systems, LIMS, IoT, and external data feeds
• Ensure data quality, governance, lineage, and metadata management in compliance with regulatory standards (GxP, FDA, ISO)
• Collaborate with data analysts, data scientists, product teams, and business stakeholders
• Implement CI/CD, version control, and automated testing for data pipelines
• Support data security, access controls, and compliance requirements
• Participate in architecture reviews and contribute to enterprise data strategy
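Illustrative sketches: the Snowflake SQL below shows the features named in the qualifications and responsibilities above. These are minimal sketches, not part of the employer's requirements, and every database, schema, table, warehouse, role, and policy name is a hypothetical placeholder.

First, a sketch of warehouse sizing, multi-cluster concurrency scaling, cost controls via a resource monitor, change capture with Streams and Tasks, Time Travel, and Zero-Copy Cloning:

-- All object names are hypothetical.
-- Multi-cluster virtual warehouse with auto-suspend for cost control
CREATE WAREHOUSE IF NOT EXISTS elt_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3      -- scales out under concurrent load
  AUTO_SUSPEND      = 60     -- seconds idle before suspending
  AUTO_RESUME       = TRUE;

-- Resource monitor caps credit spend on that warehouse
CREATE OR REPLACE RESOURCE MONITOR elt_monitor
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE elt_wh SET RESOURCE_MONITOR = elt_monitor;

-- Stream records row-level changes on a raw table
CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw.orders;

-- Task periodically merges only new rows downstream; consuming the
-- stream in the INSERT advances its offset
CREATE TASK IF NOT EXISTS merge_orders
  WAREHOUSE = elt_wh
  SCHEDULE  = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO analytics.orders_clean
  SELECT order_id, order_ts, amount
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';
ALTER TASK merge_orders RESUME;  -- tasks are created suspended

-- Time Travel: query the table as it stood one hour ago
SELECT COUNT(*) FROM analytics.orders_clean AT (OFFSET => -3600);

-- Zero-Copy Clone: instant, metadata-only copy for a dev/test environment
CREATE TABLE analytics.orders_clean_dev CLONE analytics.orders_clean;

Next, a sketch of continuous ingestion with Snowpipe and querying semi-structured JSON, again with placeholder names:

-- Auto-ingest pipe loads staged JSON files into a VARIANT column
CREATE PIPE IF NOT EXISTS iot_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.iot_events (v)
  FROM @raw.iot_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Dot-path access plus FLATTEN over a nested array
SELECT
  e.v:device_id::STRING  AS device_id,
  r.value:reading::FLOAT AS reading
FROM raw.iot_events e,
     LATERAL FLATTEN(input => e.v:readings) r;

Finally, a governance sketch covering the masking and row access policies mentioned above (roles, tables, and the entitlements mapping table are invented for illustration):

-- Column masking: only a privileged role sees raw values
CREATE OR REPLACE MASKING POLICY pii_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
       ELSE '***MASKED***' END;
ALTER TABLE analytics.patients
  MODIFY COLUMN email SET MASKING POLICY pii_mask;

-- Row access: roles see only the sites they are entitled to
CREATE OR REPLACE ROW ACCESS POLICY site_filter AS (site_id STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'GLOBAL_ANALYST'
  OR EXISTS (
    SELECT 1 FROM governance.site_entitlements e
    WHERE e.role_name = CURRENT_ROLE()
      AND e.site_id   = site_id
  );
ALTER TABLE analytics.batches ADD ROW ACCESS POLICY site_filter ON (site_id);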