

GCP Data Engineer - Contract - W2 - No C2C/1099
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Phoenix, AZ, offering a long-term W2 contract. Requires 8+ years of experience in Java ETL, big data technologies, and GCP services. Key skills include Java, ETL, and data warehousing.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 18, 2025
Project duration: Unknown
Location type: On-site
Contract type: W2 (no C2C/1099)
Security clearance: Unknown
Location detailed: Phoenix, AZ
Skills detailed: #Data Pipeline #Data Processing #BigQuery #Dataflow #Java #Data Integration #Spark (Apache Spark) #Big Data #Storage #Data Engineering #Apache Beam #ETL (Extract, Transform, Load) #Data Warehouse #GCP (Google Cloud Platform) #Cloud #Scala #Data Analysis
Role description
Hi,
Please review the requirements below and, if interested, forward your resume along with your contact information to raja@covetitinc.com.
Role : Data Engineer
Location : Phoenix, AZ (Onsite)
Duration : Long Term - Contract
Only W2, No C2C/1099
8+ years of genuine experience required; OPT/CPT candidates are also fine
Job Description:
We are looking for a skilled Java ETL Data Engineer to design, build, and maintain scalable data pipelines and data warehousing solutions on Google Cloud Platform (GCP). The ideal candidate will have hands-on experience in Java-based ETL development, big data technologies, and cloud-based data engineering.
Skills: Java, ETL, Big Data, Data Warehouse, GCP
Key Responsibilities:
1. Develop and optimize ETL workflows using Java and big data tools (e.g., Apache Beam, Spark).
2. Design and implement data integration and transformation solutions for large-scale data processing.
3. Work with GCP services such as BigQuery, Dataflow, and Cloud Storage.
4. Collaborate with data analysts, architects, and stakeholders to me
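For candidates gauging fit, the Java-based ETL work described above can be pictured with a minimal sketch. This is illustrative only and not from the posting: plain Java standing in for the Apache Beam / Dataflow pipelines the role involves, with all class and method names hypothetical.

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

// Hypothetical sketch of an extract-transform-load step in plain Java.
// A real pipeline for this role would use Apache Beam transforms running
// on Dataflow, reading from Cloud Storage and loading into BigQuery.
public class EtlSketch {
    // "Extract": raw comma-separated records, as if read from a source file.
    static List<String> extract() {
        return List.of("alice,42", "bob,7", "carol,19");
    }

    // "Transform": parse each record and normalize the name to upper case.
    static List<String> transform(List<String> rows) {
        return rows.stream()
                .map(r -> r.split(","))
                .map(f -> f[0].toUpperCase(Locale.ROOT) + ":" + Integer.parseInt(f[1]))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // "Load" is stubbed as printing; a real job would write to a warehouse table.
        transform(extract()).forEach(System.out::println);
    }
}
```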