Galent

GCP Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Architect on a contract basis in Phoenix, AZ. Required skills include GCP certification (Data Engineer or Architect), hands-on experience with data ingestion, data modeling, and migration, and proficiency in GCP services such as BigQuery, Dataflow, and Pub/Sub. Strong problem-solving and communication skills are essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 4, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#BigQuery #Scala #GCP (Google Cloud Platform) #Data Ingestion #Storage #Dataflow #Migration #Data Migration #Data Modeling #Big Data #Strategy #ETL (Extract, Transform, Load) #Cloud #Data Architecture #Data Engineering #Data Governance #Data Quality
Role description
Job Title: GCP Big Data Architect
Location: Phoenix, AZ (Onsite/Hybrid preferred)
Type: Contract

Role Overview
We are seeking a highly hands-on GCP Big Data Architect / Senior Engineer to help design and implement the foundational data architecture for our enterprise. The ideal candidate will be a GCP-certified Data Engineer with deep expertise in data ingestion, modeling, and migration, capable of turning complex business problems into scalable cloud data solutions.

Key Responsibilities
• Lead the design and development of data domains and data models within the GCP ecosystem.
• Build and optimize data ingestion pipelines from diverse data sources (see the illustrative sketch after this description).
• Drive the GCP data migration strategy, ensuring scalability, performance, and cost optimization.
• Collaborate closely with directors and cross-functional teams to translate problem statements into executable technical plans.
• Serve as a hands-on technical lead, mentoring junior engineers and ensuring best practices in data architecture and engineering.

Required Skills & Experience
• GCP Certification (Data Engineer or Architect) is mandatory.
• Proven experience building large-scale data platforms and ETL/ELT pipelines in GCP.
• Strong hands-on experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, and Composer.
• Deep understanding of data modeling, data governance, and data quality frameworks.
• Experience leading or architecting GCP migration projects end-to-end.
• Excellent problem-solving and communication skills with a proactive, execution-oriented mindset.
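
Illustrative example (not part of the posting): a minimal sketch of the kind of streaming ingestion pipeline the responsibilities above describe, using the Apache Beam Python SDK (the programming model behind Dataflow) to read JSON events from Pub/Sub and append them to BigQuery. The project, subscription, table, and schema names are hypothetical placeholders, not details from this role.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode is required for an unbounded Pub/Sub source.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical subscription path, for illustration only.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            # Pub/Sub delivers raw bytes; decode and parse into dicts whose
            # keys match the BigQuery schema declared below.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical dataset/table; rows are appended as they arrive.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()

The same code runs locally for testing with Beam's default DirectRunner, or on Dataflow by passing --runner=DataflowRunner along with project, region, and staging options; a production version of this pattern would typically add dead-letter handling and schema management on top.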