Galent

GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Phoenix, AZ, with an unspecified contract duration and a competitive pay rate. Key skills include Python, SQL, PySpark, and extensive experience with ETL and data pipelines.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
October 17, 2025
πŸ•’ - Duration
Unknown
-
🏝️ - Location
On-site
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#GitHub #DMS (Data Migration Service) #Storage #PySpark #Programming #Big Data #Data Engineering #Airflow #Hadoop #Scala #SQL (Structured Query Language) #Java #Dataflow #Apache Airflow #Jenkins #Data Pipeline #BigQuery #Python #GCP (Google Cloud Platform) #Cloud #ETL (Extract, Transform, Load) #Spark (Apache Spark)
Role description
JD:
Job Title: GCP Data Engineer
Location: Phoenix, AZ (Onsite)
Skills: Python, SQL, PySpark, ETL, GCP
Must Have
1. Extensive hands-on experience in object-oriented programming using Python and the PySpark APIs
2. Experience building data pipelines for huge volumes of data
3. Deep understanding of ETL concepts, data warehousing principles, and SQL
4. Hands-on experience writing basic to advanced optimized queries using SQL and BigQuery
5. Hands-on experience developing solutions using Dataflow, Dataproc, and BigQuery
Good to Have
1. Familiarity with CI/CD pipelines (e.g., Jenkins, GitHub Actions)
Mandatory Skills: GCP Storage, GCP BigQuery, GCP Dataproc, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, GCP Analytics Hub, GCP Workflows, GCP Dataform, GCP Data Fusion, GCP Pub/Sub, ANSI SQL, GCP Cloud Composer, GCP Dataflow, Python for Data, Big Data Hadoop Ecosystem
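For context, here is a minimal sketch of the kind of PySpark ETL pipeline the role describes: extract raw files from Cloud Storage, transform them, and load the result into BigQuery. It assumes a Dataproc runtime with the spark-bigquery connector on the classpath; the bucket, dataset, and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark ETL sketch: GCS -> transform -> BigQuery.
# Assumes the spark-bigquery connector is available (standard on Dataproc).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSVs landed in Cloud Storage (hypothetical path).
raw = (spark.read
       .option("header", True)
       .csv("gs://example-bucket/raw/orders/*.csv"))

# Transform: cast types, drop bad rows, aggregate revenue per day.
daily = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("order_date", F.to_date("order_ts"))
         .dropna(subset=["order_date", "amount"])
         .groupBy("order_date")
         .agg(F.sum("amount").alias("revenue"),
              F.count("*").alias("orders")))

# Load: write to BigQuery via the connector (needs a temp GCS bucket).
(daily.write
 .format("bigquery")
 .option("table", "example_dataset.daily_orders")
 .option("temporaryGcsBucket", "example-bucket-tmp")
 .mode("overwrite")
 .save())

spark.stop()
```

And a similarly hedged sketch of orchestrating that job with Apache Airflow (as on Cloud Composer), using the Google provider's DataprocSubmitJobOperator; the project, region, and cluster names are again placeholders.

```python
# Minimal Airflow DAG sketch: submit the PySpark job to Dataproc daily.
# Assumes apache-airflow-providers-google is installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {
        "main_python_file_uri": "gs://example-bucket/jobs/orders_etl.py"
    },
}

with DAG(
    dag_id="orders_etl_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    # Submit the PySpark ETL job to an existing Dataproc cluster.
    run_etl = DataprocSubmitJobOperator(
        task_id="run_orders_etl",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )
```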