UNICOM Technologies Inc

Data Engineering with GCP - On Our W2

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineering position with GCP, onsite in Phoenix, offering $50/hr on W2 for a contract length of unspecified duration. Candidates must have 5-8 years of experience in GCP, strong ETL skills, and familiarity with Oracle Fusion Cloud.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
πŸ—“οΈ - Date
October 17, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#Python #GCP (Google Cloud Platform) #PySpark #Cloud #Big Data #Data Ingestion #Data Engineering #ETL (Extract, Transform, Load) #Oracle #HiveQL #Airflow #Spark (Apache Spark) #Data Processing
Role description
Role - Data Engineering with GCP (On our W2)
Number of Openings - 1
Location - Onsite, Phoenix (PHX)
Rate - $50/hr on W2
Visa - Only U.S. citizens and local candidates will be accepted

Note: If we have two strong profiles at slightly higher rates, share them as well; I'll check with the client for consideration. However, try to stay within the stated rate. This is the JD received from the lead owner. Ex-OnX candidates can be submitted, but please mention the reason for leaving OnX. Resumes must be shared by EOD Thursday, 10/16/2025.

Must have 5-8 years of Data Engineering experience in a GCP Big Data environment:
• Strong experience working in GCP
• Strong data transformation/ETL skills using Spark, Python, PySpark, and HiveQL
• Hands-on experience building data ingestion pipelines from Oracle Recruitment Cloud to a Big Data environment (cloud or on-prem)
• Strong hands-on experience with data processing tasks using Spark/Python, cleansing and curating data to populate analytical models
• Familiarity with data models and business processes in the Recruitment and Learning modules of Oracle Fusion Cloud
• Data-savvy, with strong analytical skills
• Capable of independently delivering work items and leading data discussions with Tech Leads, Architects, and Implementation Partners
• Experience with CI/CD tools and code management processes preferred
• Prior experience with a transition from Taleo to Oracle Fusion Cloud is nice to have

Must have solid hands-on experience with:
• GCP: BigQuery, Airflow, GCS, Python, Spark, Dataproc
• Data transformation/ETL
• Oracle Fusion (Recruitment or Learning)