

Smobile
GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Phoenix, AZ, lasting 12+ months, with a pay rate of $55,276.82 - $70,548.43 per year. Requires 8 years of experience, expertise in Big Data, GCP, PySpark, and advanced SQL skills.
Country
United States
Currency
$ USD
-
Day rate
320
-
Date
March 19, 2026
Duration
More than 6 months
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Phoenix, AZ
-
Skills detailed
#Data Transformations #Spark (Apache Spark) #Data Architecture #ETL (Extract, Transform, Load) #Data Engineering #Big Data #PySpark #Scala #Migration #SQL (Structured Query Language) #Data Migration #Data Quality #Cloud #GCP (Google Cloud Platform) #Data Processing #Data Modeling #SQL Queries
Role description
Position – GCP Data Engineer
Location – Phoenix, AZ (local candidates only; in-person interview)
Duration – 12+ months
Only candidates local to Arizona will be considered. Candidates should have 8 years of experience.
Required Qualifications:
We are looking for an experienced Data Engineer (Engineer II – ETL) with strong expertise in Big Data, Data Warehousing, and GCP. The ideal candidate should have hands-on experience in data transformation, strong SQL skills, and extensive experience with PySpark.
Key Responsibilities:
Develop and maintain scalable ETL pipelines on GCP (LUMI platform)
Perform complex data transformations using PySpark
Write optimized and high-performance SQL queries
Work on data backfill processes and large-scale data migrations
Design and implement data models for enterprise data platforms
Ensure data quality, integrity, and governance standards
Collaborate with Data Architects and Business teams
Required Skills:
Strong experience in Big Data & Data Warehousing
Hands-on experience with Google Cloud Platform (GCP)
Expertise in PySpark
Advanced knowledge of SQL
Experience with data modeling
Experience with data backfill and large data processing
Strong understanding of ETL frameworks
Job Type: Contract
Pay: $55,276.82 - $70,548.43 per year
Location:
Phoenix, AZ (Required)
Work Location: In person






