

Galent
GCP Data Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Architect on a contract basis in Phoenix, AZ. Required skills include GCP certification, experience with data ingestion and migration, and proficiency in GCP services. Strong problem-solving and communication skills are essential.
Country
United States
-
Currency
$ USD
-
Day rate
Unknown
-
Date
November 4, 2025
-
Duration
Unknown
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Phoenix, AZ
-
Skills detailed
#BigQuery #Scala #GCP (Google Cloud Platform) #Data Ingestion #Storage #Dataflow #Migration #Data Migration #Data Modeling #Big Data #Strategy #ETL (Extract, Transform, Load) #Cloud #Data Architecture #Data Engineering #Data Governance #Data Quality
Role description
Job Title: GCP Big Data Architect
Location: Phoenix, AZ (Onsite/Hybrid preferred)
Type: Contract
Role Overview
We are seeking a highly hands-on GCP Big Data Architect / Senior Engineer to help design and implement the foundational data architecture for our enterprise. The ideal candidate will be a GCP-certified Data Engineer with deep expertise in data ingestion, modeling, and migration, capable of turning complex business problems into scalable cloud data solutions.
Key Responsibilities
β’ Lead the design and development of data domains and data models within the GCP ecosystem.
β’ Build and optimize data ingestion pipelines from diverse data sources.
β’ Drive the GCP data migration strategy, ensuring scalability, performance, and cost optimization.
β’ Collaborate closely with directors and cross-functional teams to translate problem statements into executable technical plans.
β’ Serve as a hands-on technical lead, mentoring junior engineers and ensuring best practices in data architecture and engineering.
Required Skills & Experience
β’ GCP Certification (Data Engineer or Architect) is mandatory.
β’ Proven experience building large-scale data platforms and ETL/ELT pipelines in GCP.
β’ Strong hands-on experience with GCP services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, and Composer.
β’ Deep understanding of data modeling, data governance, and data quality frameworks.
β’ Experience leading or architecting GCP migration projects end-to-end.
β’ Excellent problem-solving and communication skills with a proactive, execution-oriented mindset.





