

Kastech Software Solutions Group
GCP Big Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Big Data Engineer in Phoenix, AZ, on a long-term contract, offering competitive pay. Requires 14+ years of experience, proficiency in Python, SQL, Spark, and GCP, along with relevant certifications and a Bachelor's degree.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
May 15, 2026
Duration
Unknown
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Phoenix, AZ
-
Skills detailed
#Cloud #Data Engineering #"ETL (Extract, Transform, Load)" #Dataflow #Datasets #Computer Science #Security #Data Quality #Big Data #DevOps #Apache Spark #Python #Scala #GCP (Google Cloud Platform) #Spark (Apache Spark) #SQL (Structured Query Language) #BigQuery #Data Pipeline
Role description
Role: GCP Big Data Engineer
Location: Phoenix, AZ - Hybrid/Onsite
Duration: Long-term contract
Job Description
We are looking for a skilled GCP Big Data Engineer with 14+ years of experience to design, build, and maintain scalable data pipelines and big data solutions on Google Cloud.
Responsibilities
• Develop ETL/ELT pipelines using BigQuery, Cloud Dataflow, and Dataproc
• Process large datasets using Apache Spark and SQL (a brief illustrative sketch follows this list)
• Optimize data workflows, performance, and cloud costs
• Collaborate with cross-functional teams for analytics and reporting solutions
• Ensure data quality, security, and reliability
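As a rough, non-authoritative illustration of the pipeline work listed above, the sketch below shows a minimal PySpark job of the kind that might run on Dataproc: it reads a BigQuery table through the spark-bigquery connector, aggregates it with Spark SQL, and writes the result back to BigQuery. All project, dataset, table, and bucket names are placeholders, not details from this posting, and the connector is assumed to be available on the cluster (preinstalled on recent Dataproc images or supplied as a jar).

from pyspark.sql import SparkSession

# Minimal ELT sketch: BigQuery -> Spark SQL aggregation -> BigQuery.
spark = SparkSession.builder.appName("daily-revenue-rollup").getOrCreate()

# Read the source table via the spark-bigquery connector.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw.orders")  # placeholder project/dataset/table
    .load()
)
orders.createOrReplaceTempView("orders")

# Transform with Spark SQL: daily revenue per region.
daily_revenue = spark.sql("""
    SELECT region,
           DATE(order_ts) AS order_date,
           SUM(amount)    AS revenue
    FROM orders
    GROUP BY region, DATE(order_ts)
""")

# Write the result back to BigQuery; a GCS bucket stages the load job.
(
    daily_revenue.write.format("bigquery")
    .option("table", "my-project.analytics.daily_revenue")  # placeholder table
    .option("temporaryGcsBucket", "my-staging-bucket")      # placeholder bucket
    .mode("overwrite")
    .save()
)

On Dataproc, a job like this would typically be submitted with gcloud dataproc jobs submit pyspark, pointing at the script and the target cluster.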
Required Skills
• Strong experience in Google Cloud
• Proficiency in Python, SQL, and Spark
• Hands-on experience with BigQuery, Dataflow, and Dataproc
• Knowledge of ETL, data warehousing, and cloud architectures
• Familiarity with CI/CD and DevOps tools is a plus
Qualification
• Bachelor's degree in Computer Science or a related field
• GCP certifications
Experience
• 12+ years of experience in Data Engineering or Big Data technologies






