

Data Engineer (ETL)--W2 Only
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (ETL) on a W2 contract-to-hire basis, requiring 5+ years of experience, preferably 7+. Key skills include Snowflake, Azure cloud (1-2 integrations), and data integration tools like PowerCenter and SnowSQL. Locals preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 14, 2025
Project duration: Unknown
Location type: Unknown
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Minnetonka, MN
Skills detailed: #Data Quality #CLI (Command-Line Interface) #Data Engineering #Data Pipeline #SnowSQL #"ETL (Extract, Transform, Load)" #Datasets #Scala #Cloud #Azure #Snowflake #Data Science #Azure cloud #Data Integration #BI (Business Intelligence)
Role description
W2 only
This is a contract-to-hire role.
Locals preferred.
Top needed skills: Snowflake and Azure cloud; candidates must have done 1-2 integrations.
Data integration tools: PowerCenter, IDMC, SnowSQL (SnowSQL is the command-line client used to interact with Snowflake).
Job Overview
The Data Engineer will develop and manage scalable data pipelines and infrastructure to support data-driven decision-making. This role requires collaboration with cross-functional teams to ensure data availability, reliability, and optimal performance. Candidates should have 5+ years of experience, preferably 7+.
Key Responsibilities
• Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large datasets from various sources.
• Develop robust ETL (Extract, Transform, Load) processes to integrate data into the organization.
• Design and maintain data models, schemas, and database structures to support analytical and business intelligence needs.
• Implement processes and systems to monitor data quality, ensuring production data is accurate and available for key stakeholders and business processes.
• Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business objectives.
• Identify, design, and implement internal process improvements, such as re-designing infrastructure for greater scalability and optimizing data delivery.
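To illustrate the kind of work these responsibilities describe, here is a minimal ETL sketch in Python: extract records from a source, apply a transform with a basic data-quality gate, and load the clean rows into a database table. This is an illustrative example only, not the employer's actual stack; the function names, the sample rows, and the `orders` table are hypothetical, and SQLite stands in for a warehouse such as Snowflake.

```python
# Illustrative ETL pipeline sketch (hypothetical names and data).
import sqlite3

def extract():
    # Stand-in for pulling rows from a source system or file.
    return [
        {"id": 1, "amount": "19.99", "region": "MN"},
        {"id": 2, "amount": "bad", "region": "MN"},   # fails type check
        {"id": 3, "amount": "5.00", "region": None},  # missing region
    ]

def transform(rows):
    # Cast types and drop rows that fail simple data-quality rules.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            continue  # non-numeric amount: reject the row
        if row["region"] is None:
            continue  # required field missing: reject the row
        clean.append((row["id"], amount, row["region"]))
    return clean

def load(rows, conn):
    # Write validated rows to the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 1 valid row
```

In a production pipeline the same extract/transform/load structure would typically be orchestrated by a scheduler and the quality checks reported to stakeholders rather than silently dropping rows.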
Preferred Tools
• Cloud Platforms: Snowflake, Azure (1-2 cloud integrations experience is a must)
• Data Integration Tools: PowerCenter, IDMC, SnowSQL