Kastech Software Solutions Group

GCP Big Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Big Data Engineer in Phoenix, AZ, on a long-term contract, offering competitive pay. Requires 14+ years of experience, proficiency in Python, SQL, Spark, and GCP, along with relevant certifications and a Bachelor's degree.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
May 15, 2026
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Hybrid
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#Cloud #Data Engineering #ETL (Extract, Transform, Load) #Dataflow #Datasets #Computer Science #Security #Data Quality #Big Data #DevOps #Apache Spark #Python #Scala #GCP (Google Cloud Platform) #Spark (Apache Spark) #SQL (Structured Query Language) #BigQuery #Data Pipeline
Role description
Role: GCP Big Data Engineer
Location: Phoenix, AZ – Hybrid/Onsite
Duration: Long-term contract

Job Description
We are looking for a skilled GCP Big Data Engineer with 14+ years of experience to design, build, and maintain scalable data pipelines and big data solutions on Google Cloud.

Responsibilities
• Develop ETL/ELT pipelines using BigQuery, Cloud Dataflow, and Dataproc
• Process large datasets using Apache Spark and SQL
• Optimize data workflows, performance, and cloud costs
• Collaborate with cross-functional teams on analytics and reporting solutions
• Ensure data quality, security, and reliability

Required Skills
• Strong experience with Google Cloud
• Proficiency in Python, SQL, and Spark
• Hands-on experience with BigQuery, Dataflow, and Dataproc
• Knowledge of ETL, data warehousing, and cloud architectures
• Familiarity with CI/CD and DevOps tools is a plus

Qualifications
• Bachelor's degree in computer science or a related field
• GCP certifications

Experience
• 12+ years of experience in Data Engineering or Big Data technologies