V-Soft Consulting Group, Inc.

Snowflake Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer based in Chicago, IL or Tempe, AZ (hybrid) for a 6–12-month contract. Key skills include deep Snowflake expertise, AWS and Azure experience, Python proficiency, and data engineering fundamentals.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 20, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #Azure #Dynatrace #ADF (Azure Data Factory) #S3 (Amazon Simple Storage Service) #Microsoft Power BI #ETL (Extract, Transform, Load) #SQL Server #Lambda (AWS Lambda) #Data Ingestion #Databases #Aurora #BI (Business Intelligence) #Storage #Azure cloud #Snowflake #SQL (Structured Query Language) #Python #Monitoring #Kafka (Apache Kafka) #Data Engineering #Batch #Data Modeling #Cloud
Role description
Title: Snowflake Developer
Location: Chicago, IL or Tempe, AZ (hybrid)
Duration: 6–12-month contract (W2)

Position Overview:
We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate will have strong expertise in the Snowflake data platform, with proven experience designing and implementing robust data ingestion solutions across both file-based and streaming architectures on the AWS and Azure cloud platforms.

Must-haves:
• Deep, hands-on Snowflake development experience (not high-level exposure). Beyond querying Snowflake in the cloud, the candidate must be able to move data from sources such as ADF and Python pipelines into AWS/Azure, determine what data has changed, set up batch or streaming technologies, and deliver data that end users can consume in Snowflake or access through other tools (such as Power BI).
• AWS and Azure experience — ideally strong in both.
• Practical experience with:
   • AWS services: Lambda, S3/storage, error monitoring (CloudWatch, Dynatrace), and database offerings such as Aurora
   • Azure services: Data Factory, Blob Storage, Functions, SQL Server
• Python (for data movement and transformations).
• Data engineering fundamentals: data warehousing, data modeling, ETL/ELT, change data capture, and enabling batch/streaming pipelines.
• Ability to bring data in from varied sources (APIs, Kafka, flat files, databases) and deliver it to Snowflake or BI tools (Power BI, etc.).
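The "determine what data has changed" requirement above is often handled with row-level fingerprinting before merging into a Snowflake target. A minimal, hedged Python sketch of that idea (function names and the `id` key are illustrative assumptions, not from the posting):

```python
import hashlib


def row_fingerprint(row: dict) -> str:
    """Hash a record's values so changed rows can be detected cheaply."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


def detect_changes(previous: dict, incoming: list, key: str = "id"):
    """Compare incoming rows against previously stored fingerprints.

    Returns (changed_rows, new_fingerprints): only changed or new
    records need to be merged into the target table downstream.
    """
    changed, fingerprints = [], {}
    for row in incoming:
        fp = row_fingerprint(row)
        fingerprints[row[key]] = fp
        if previous.get(row[key]) != fp:
            changed.append(row)
    return changed, fingerprints


# First load: no prior state, so every row counts as changed (new).
rows_v1 = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
changed, state = detect_changes({}, rows_v1)

# Second load: only the updated record is flagged for the merge step.
rows_v2 = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bobby"}]
changed, state = detect_changes(state, rows_v2)
```

In practice the fingerprint state would live in a control table (or be replaced entirely by Snowflake Streams), and the changed rows would feed a `MERGE` into the target.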