

VeriiPro
GCP Engineer | Full-Time
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Engineer with over 6 years of experience, including 4+ years in GCP data engineering, focusing on application migration. Key skills include BigQuery, PySpark, Python, ETL/ELT tools, and data orchestration. Contract duration exceeds 6 months.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 11, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#Data Storage #GCP (Google Cloud Platform) #PySpark #Python #Scripting #Data Orchestration #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Migration #Agile #Spark (Apache Spark) #Cloud #Airflow #Shell Scripting #Storage #Data Engineering #BigQuery
Role description
Must Have Technical/Functional Skills
• GCP Engineer with BigQuery, PySpark, and Python experience
Roles & Responsibilities
• 6+ years of professional experience, including at least 4+ years as a GCP Data Engineer
• Experience with GCP application migration for a large enterprise
• Hands-on experience with Google Cloud Platform (GCP)
• Extensive experience with ETL/ELT tools and data transformation frameworks
• Working knowledge of data storage solutions such as BigQuery or Cloud SQL
• Solid skills with data orchestration tools such as Airflow or Cloud Workflows
• Familiarity with Agile development methods
• Hands-on experience with Spark, Python, and the PySpark APIs
• Knowledge of shell scripting tools