

Tekskills Inc.
Azure Data Engineer (W2 only)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer with 8+ years of experience, focusing on Azure Data Factory, Snowflake, and SQL. It is a 12+ month onsite position in Milwaukee, WI, requiring expertise in data ingestion and transformation frameworks.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 3, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Milwaukee, WI
-
🧠 - Skills detailed
#Azure Data Factory #Cloud #Snowflake #PySpark #Big Data #Databricks #ETL (Extract, Transform, Load) #Data Engineering #Data Ingestion #SQL (Structured Query Language) #ADF (Azure Data Factory) #Azure #Spark (Apache Spark) #Data Lake
Role description
Job Title: Azure Data Engineer
Location: Milwaukee, WI 53209 (Onsite)
Duration: 12+ months
Minimum years of experience: 8+
Job Details:
Must Have Skills
• Azure Data Factory (ADF)
• Snowflake (SF)
• SQL
Nice-to-have skills
• PySpark
• Cloud Architecture
Detailed Job Description
• 8+ years of experience implementing Azure Data Factory (ADF), with working knowledge of Databricks.
• Build and maintain pipelines, the data ingestion framework, and the data transformation framework.
• Experience with the Azure platform, Big Data, cloud technologies, and Snowflake; knowledge of Data Lake and reporting tools.
• Work closely with the Architect, the Infosys Cloud Architect, and the JCI Ingestion team to create and deliver solutions.
• Take technical responsibility for all upgrade remediation.
