K&K Global Talent Solutions INC.

GCP Data Engineer with Java Exp.

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with Java experience in Phoenix, AZ, offering a contract of unspecified length and competitive pay. Candidates must have 4+ years in data engineering, strong Java skills, and hands-on GCP expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 12, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Cloud #BigQuery #Data Modeling #Apache Beam #Data Engineering #REST (Representational State Transfer) #Version Control #Computer Science #Java #REST API #Kubernetes #Dataflow #Docker #Microservices #Storage #GIT #GCP (Google Cloud Platform) #Scala #Programming #Data Pipeline #Airflow
Role description
K&K Global Talent Solutions Inc. is an international recruiting agency that has been providing technical resources in Canada and the USA since 1993. This position is with one of our clients in the USA, who is actively hiring candidates to expand their teams.

Job Title: GCP Data Engineer with Java Exp.
Location: Phoenix, AZ (local candidates only)
Mode of interview: Face to face

Job Summary:
We are seeking a highly skilled GCP Data Engineer with strong Java expertise to design, develop, and maintain scalable data pipelines and cloud-based data solutions. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services, data engineering best practices, and backend development using Java.

Required Skills & Qualifications
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 4+ years of experience in data engineering.
• Strong programming experience in Java (Java 8 or above).
• Hands-on experience with Google Cloud Platform (GCP) services:
  • BigQuery
  • Dataflow (Apache Beam)
  • Pub/Sub
  • Cloud Storage
  • Cloud Composer (Airflow)
  • Cloud Functions / Cloud Run
• Experience building ETL/ELT pipelines.
• Strong understanding of SQL and data modeling concepts.
• Experience with distributed processing frameworks.
• Knowledge of REST APIs and microservices architecture.
• Familiarity with containerization (Docker) and orchestration (Kubernetes).
• Experience with version control systems (Git).
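For illustration, below is a minimal sketch of the kind of Dataflow (Apache Beam) pipeline work this role describes: a batch job that reads CSV files from Cloud Storage and appends rows to a BigQuery table. The bucket, project, dataset, table, and class names are hypothetical placeholders, not the client's systems.

```java
// Illustrative only: a minimal Apache Beam batch pipeline in Java that loads
// CSV order records from Cloud Storage into BigQuery. All resource names are
// hypothetical placeholders.
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class OrdersToBigQuery {
  public static void main(String[] args) {
    // Runs locally by default; pass --runner=DataflowRunner --project=... --region=...
    // to execute the same pipeline on Cloud Dataflow.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        // Read raw CSV lines from a (placeholder) Cloud Storage bucket.
        .apply("ReadCsv", TextIO.read().from("gs://example-bucket/orders/*.csv"))
        // Parse each "order_id,amount" line into a BigQuery TableRow.
        .apply("ParseToTableRow", MapElements
            .into(TypeDescriptor.of(TableRow.class))
            .via((String line) -> {
              String[] fields = line.split(",");
              return new TableRow()
                  .set("order_id", fields[0])
                  .set("amount", Double.parseDouble(fields[1]));
            }))
        .setCoder(TableRowJsonCoder.of())
        // Append rows to an existing (placeholder) BigQuery table.
        .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
            .to("example-project:analytics.orders")
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run().waitUntilFinish();
  }
}
```

On Cloud Dataflow the same code scales out across workers without changes; in a setup like the one described, Cloud Composer (Airflow) would typically schedule and monitor such jobs.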