

SPECTRAFORCE
Snowflake Developer/Data Engineering (Only on W2)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Snowflake Developer/Data Engineer based in Chicago, IL or Tempe, AZ, lasting 12 months. Pay is $600/day on W2. Requires 3+ years of experience with Snowflake, AWS, Azure, Python, and data ingestion. Snowflake certification preferred.
Country
United States
Currency
$ USD
-
Day rate
600
-
Date
February 6, 2026
Duration
More than 6 months
-
Location
On-site
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Chicago, IL
-
Skills detailed
#Automation #API (Application Programming Interface) #Cloud #Documentation #Terraform #BI (Business Intelligence) #Monitoring #Lambda (AWS Lambda) #Data Modeling #DevOps #S3 (Amazon Simple Storage Service) #Computer Science #Data Warehouse #Scala #Kubernetes #Infrastructure as Code (IaC) #Deployment #ETL (Extract, Transform, Load) #Batch #GIT #Storage #Data Architecture #Azure #Data Pipeline #Visualization #Data Quality #Data Security #Programming #Data Processing #Docker #Compliance #AWS (Amazon Web Services) #Azure cloud #Snowflake #Security #Version Control #Data Engineering #Python #SQL (Structured Query Language) #Data Ingestion
Role description
Role: Snowflake Developer/Data Engineering (Only on W2)
Duration: 12 months with possibility of extension
Location: Chicago, IL or Tempe, AZ - 3 days onsite per week
Position Overview:
We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate will have strong expertise in Snowflake data platform capabilities, with proven experience in designing and implementing robust data ingestion solutions across both file-based and streaming architectures on AWS and Azure cloud platforms.
Experience Level: Senior Level - 3
Required Qualifications
Technical Skills:
• 3+ years of hands-on experience with the Snowflake data platform, including advanced SQL, stored procedures, and performance optimization
• Strong experience with data ingestion patterns, including bulk loading, micro-batching, and streaming data processing
• Proficiency with AWS services such as S3, Lambda, and CloudWatch
• Experience with Azure data services, including Data Factory, Event Hubs, Blob Storage, and Azure Functions
• Solid Python programming skills for data processing, API integrations, and automation scripts
• Experience with data modeling concepts and dimensional modeling techniques
• Understanding of data security, governance, and compliance best practices
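As an illustration of the micro-batching ingestion pattern listed above, here is a minimal, hypothetical Python sketch. All names are invented for illustration; in a real Snowflake pipeline the flush step would typically stage files to cloud storage and load them with COPY INTO or Snowpipe rather than collect them in memory.

```python
import time


class MicroBatcher:
    """Hypothetical sketch of micro-batching: buffer incoming records,
    then flush them as a batch once either a size threshold or a time
    threshold is reached. In a real pipeline, flush_fn might write the
    batch to S3/Blob Storage and trigger a Snowflake COPY INTO load."""

    def __init__(self, flush_fn, max_batch_size=100, max_wait_seconds=5.0):
        self.flush_fn = flush_fn
        self.max_batch_size = max_batch_size
        self.max_wait_seconds = max_wait_seconds
        self._buffer = []
        self._first_arrival = None  # time the oldest buffered record arrived

    def add(self, record):
        if self._first_arrival is None:
            self._first_arrival = time.monotonic()
        self._buffer.append(record)
        # Flush when the batch is full or the oldest record has waited too long.
        if (len(self._buffer) >= self.max_batch_size
                or time.monotonic() - self._first_arrival >= self.max_wait_seconds):
            self.flush()

    def flush(self):
        if self._buffer:
            self.flush_fn(self._buffer)
            self._buffer = []
            self._first_arrival = None


# Usage: collect flushed batches in a list instead of loading into Snowflake.
batches = []
batcher = MicroBatcher(batches.append, max_batch_size=3)
for i in range(7):
    batcher.add({"id": i})
batcher.flush()  # drain any remaining partial batch
```

With seven records and a batch size of three, this produces two full batches followed by a final partial batch of one record. The same size-or-age trade-off governs tools like Snowpipe Streaming, where smaller batches lower latency at the cost of more load operations.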
Tasks & Responsibilities:
• Design, develop, and maintain scalable data pipelines using Snowflake as the core data warehouse platform
• Build and optimize data ingestion processes for both batch file-based loads and real-time streaming data from various sources
• Implement data transformation logic using Snowflake SQL, stored procedures, and Python integration
• Collaborate with data architects and analysts to understand business requirements and translate them into technical solutions
• Monitor and troubleshoot data pipeline performance, ensuring high availability and data quality
• Develop and maintain documentation for data processes, data models, and system architecture
• Work closely with DevOps teams to implement CI/CD practices for data pipeline deployments
Professional Experience:
• Bachelor's degree in Computer Science, Information Systems, or a related technical field
• 4+ years of experience in data engineering or related roles
• Proven track record of delivering production-ready data solutions at scale
• Experience with version control systems (Git) and collaborative development practices
Preferred Qualifications:
• Snowflake certifications (SnowPro Core or Advanced certifications)
• Experience with Infrastructure as Code tools (Terraform, CloudFormation)
• Knowledge of containerization technologies (Docker, Kubernetes)
• Familiarity with data visualization tools and business intelligence platforms
• Experience with data quality frameworks and monitoring tools
Applicant Notices & Disclaimers
• For information on benefits, equal opportunity employment, and location-specific applicant notices, click here






