

Acetech Group Corporation
Snowflake Developer W2 Open for Sponsorship
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer in Chicago, IL, on a long-term contract. Key skills include Snowflake Development, Kafka, MS Azure, and Databricks. Experience in data integration, ETL processes, and data modeling is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 27, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Databases #"ETL (Extract #Transform #Load)" #Data Architecture #Data Modeling #Documentation #Oracle #Python #Snowflake #Airflow #Scripting #Azure #Data Integration #Kafka (Apache Kafka) #Databricks #Data Pipeline
Role description
Role: Snowflake Developer
Location: Chicago, IL (Onsite)
Duration: Long Term Contract
Top Skills:
1. Snowflake Development
2. Kafka
3. MS Azure
4. Databricks
Key Responsibilities:
Data Integration:
Implement and maintain data synchronization between on-premises Oracle databases and Snowflake using Kafka and CDC (change data capture) tools (an illustrative sketch of this pattern follows the list).
Support Data Modeling:
Assist in developing and optimizing the data model for Snowflake, ensuring it supports our analytics and reporting requirements.
Data Pipeline Development:
Design, build, and manage data pipelines for the ETL process, using Airflow for orchestration and Python for scripting, to transform raw data into a format suitable for our new Snowflake data model (an illustrative DAG sketch follows the list).
Reporting Support:
Collaborate with the data architect to ensure the data within Snowflake is structured in a way that supports efficient and insightful reporting.
Technical Documentation:
Create and maintain comprehensive documentation of data pipelines, ETL processes, and data models to ensure best practices are followed and knowledge is shared within the team.
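For context on the data-integration responsibility above, here is a minimal sketch of what the CDC apply step might look like, assuming change events from Oracle land in a Snowflake staging table via Kafka. All connection parameters, table names, and the op-code column are illustrative, not from this posting.

```python
# Hypothetical sketch: merge CDC change events (captured from Oracle,
# delivered through Kafka into a staging table) into a Snowflake target.
import snowflake.connector

# Assumed staging table populated by the Kafka consumer/connector;
# 'op' is an illustrative change-type column ('I'/'U'/'D').
MERGE_SQL = """
MERGE INTO analytics.orders AS tgt
USING staging.orders_cdc AS src
  ON tgt.order_id = src.order_id
WHEN MATCHED AND src.op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET tgt.status = src.status,
                             tgt.updated_at = src.updated_at
WHEN NOT MATCHED AND src.op <> 'D' THEN
  INSERT (order_id, status, updated_at)
  VALUES (src.order_id, src.status, src.updated_at)
"""

conn = snowflake.connector.connect(
    account="my_account",   # illustrative account identifier
    user="etl_user",
    password="...",         # prefer key-pair auth or SSO in practice
    warehouse="ETL_WH",
    database="ANALYTICS_DB",
)
cur = conn.cursor()
try:
    cur.execute(MERGE_SQL)  # apply inserts, updates, and deletes in one pass
finally:
    cur.close()
    conn.close()
```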
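Likewise, a minimal sketch of the orchestration pattern the pipeline responsibility describes: a Python transform task followed by a Snowflake load, assuming Airflow 2.x with the apache-airflow-providers-snowflake package. The DAG id, task names, connection id, and stored procedure are hypothetical.

```python
# Hypothetical sketch: one daily ETL DAG orchestrating a Python
# transform step and a load into the Snowflake data model.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

def transform_raw_extract():
    # Placeholder: reshape the raw extract to match the target
    # Snowflake data model before loading.
    pass

with DAG(
    dag_id="oracle_to_snowflake_daily",  # illustrative
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="transform_raw_extract",
        python_callable=transform_raw_extract,
    )

    load = SnowflakeOperator(
        task_id="load_into_model",
        snowflake_conn_id="snowflake_default",
        sql="CALL analytics.load_orders();",  # illustrative stored procedure
    )

    transform >> load  # run the transform, then the Snowflake load
```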