

Quality Choice Solutions
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a Data Engineer position on a 6-month W2 contract, hybrid in Phoenix, AZ. It requires 6–9 years of Data Engineering experience; proficiency in GCP, Python, PySpark, SQL, and Big Data frameworks; and strong functional testing skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 14, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Spark SQL #Data Engineering #Spark (Apache Spark) #SQL (Structured Query Language) #Cloud #PySpark #Data Framework #Data Pipeline #Storage #Python #GCP (Google Cloud Platform) #Scala #Big Data #Agile #Data Quality #Data Storage
Role description
We’re looking for a skilled Data Engineer to join our team on a 6-month W2 contract. This is a hybrid role in Phoenix, AZ.
📍 Location: Phoenix, AZ (Hybrid)
📅 Duration: 6-Month Contract
💼 Employment Type: W2 only (No C2C or 1099)
🔍 Must-Have Skills:
6–9 years of hands-on experience in Data Engineering
GCP (Google Cloud Platform)
Python, PySpark, SQL, Hive, Spark
Big Data Frameworks
Functional Testing
Rally (Agile Project Tracking)
🎯 Responsibilities:
Design, build, and maintain scalable data pipelines on GCP
Develop ETL processes using Python, PySpark, and SQL (see the pipeline sketch after this list)
Manage and optimize data storage solutions (Hive, Big Data tools)
Collaborate with cross-functional teams to ensure data quality and performance
Conduct functional testing for accuracy and reliability (see the test sketch after this list)
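To give candidates a concrete sense of the pipeline work described above, here is a minimal PySpark ETL sketch. It is illustrative only: the GCS bucket paths and the column names (order_id, amount, order_ts) are hypothetical placeholders, not details of this role.

```python
# Minimal PySpark ETL sketch (illustrative only): read raw CSV from a
# GCS bucket, clean it, and write partitioned Parquet back out.
# Bucket paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: load raw data (hypothetical GCS path).
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/orders/")

# Transform: cast types, drop rows missing a key, derive a partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet (hypothetical target path).
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/orders/"
)
```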
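Likewise, functional testing on this stack typically means asserting transformation behavior against small in-memory DataFrames. Below is a hedged sketch using pytest and a local SparkSession; the transform function and its columns are the same hypothetical placeholders as above.

```python
# Sketch of a functional test for the transform step above, using
# pytest and a local SparkSession. Names are hypothetical placeholders.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def transform(df):
    # Same logic as the pipeline's transform step.
    return (
        df.withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("order_id").isNotNull())
    )

def test_transform_drops_null_ids_and_casts_amount(spark):
    raw = spark.createDataFrame(
        [("1", "9.99"), (None, "5.00")], ["order_id", "amount"]
    )
    out = transform(raw)
    rows = out.collect()
    assert len(rows) == 1                        # row with null order_id dropped
    assert isinstance(rows[0]["amount"], float)  # amount cast to double
```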