Jobs via Dice

Data Engineer (W2 Contract Role)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer (W2 Contract) position requiring 5+ years of experience, with a focus on Python, SQL, Databricks, and Apache Spark. It calls for expertise in AWS or Azure, data warehouses, and ETL processes. The work location is the USA; W2 engagement is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 24, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Azure #Data Pipeline #Data Quality #Apache Spark #AWS (Amazon Web Services) #Redshift #Data Engineering #SQL (Structured Query Language) #Datasets #BigQuery #Data Modeling #Snowflake #Python #Databricks #Cloud #ETL (Extract, Transform, Load) #Data Warehouse #Spark (Apache Spark)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Kani Solutions, is seeking the following. Apply via Dice today!

Looking for: Data Engineer (USA)

We are currently looking for qualified IT professionals on W2 to market their resumes for active and upcoming client requirements across the United States. Selected candidates will be represented by our recruiting team and submitted to direct end clients and prime vendors based on skill alignment and availability. We provide end-to-end support, including resume marketing, interview coordination, and ongoing communication throughout the hiring process. Candidates must be open to W2 engagement, client interviews, and project-based assignments.

Key Responsibilities:
• Design and build end-to-end data pipelines
• Develop and optimize ETL/ELT processes
• Work with large structured and unstructured datasets
• Ensure data quality, reliability, and performance
• Collaborate with analytics, reporting, and business teams

Required Skills:
• Strong experience with Python and SQL
• Hands-on experience with Databricks and Apache Spark
• Experience with cloud platforms (AWS or Azure)
• Data warehouses: Snowflake, Redshift, BigQuery
• Knowledge of data modeling and performance tuning

Experience: 5+ years (client-flexible)
Work Authorization: Must be authorized to work in the United States
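For context, the day-to-day pipeline work described above typically resembles the minimal PySpark sketch below. It assumes a Databricks/Spark environment; the source path, column names, and target table are hypothetical placeholders, not details from this posting.

```python
# Minimal ETL sketch, assuming a Databricks/Spark environment.
# Paths, columns, and table names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw order events from cloud storage (hypothetical path)
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: deduplicate, derive a date column, and apply a basic data-quality filter
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Load: write to a warehouse-facing table partitioned by date (hypothetical target)
clean.write.mode("overwrite").partitionBy("order_date").saveAsTable("analytics.orders_clean")
```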