

Programmers.io
GCP Data Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Architect with a contract length of "Unknown," offering a pay rate of "Unknown." Key skills include GCP expertise, data architecture, SQL, Python, and data governance. A Bachelor's/Master's degree and 7-10 years of experience are required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: October 29, 2025
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Texas, United States
Skills detailed: #Data Pipeline #Data Quality #Data Warehouse #SQL (Structured Query Language) #Data Governance #Terraform #Data Engineering #Data Management #Cloud #DataOps #Batch #Snowflake #Infrastructure as Code (IaC) #Data Strategy #Dataflow #GCP (Google Cloud Platform) #Strategy #Deployment #BigQuery #Python #Scala #Migration #Data Architecture #Security #Dimensional Modelling #Computer Science #"ETL (Extract, Transform, Load)" #AI (Artificial Intelligence) #Storage #Metadata #Java
Role description
The GCP Data Architect will play a central role in designing, building and maintaining robust, scalable and secure data platforms on the Google Cloud Platform. This individual will partner with business stakeholders, analytics teams, and engineering functions to define data strategy, architecture, governance and technical delivery of high-impact data solutions. The role demands hands-on expertise as well as strategic vision to transform legacy systems, enable advanced analytics and support data-driven decision-making across the organisation.
Key Responsibilities
• Define and lead the end-to-end data architecture on GCP: ingestion, processing, storage, analytics and consumption layers.
• Collaborate with business and technical stakeholders to translate business requirements into scalable technical data solution designs.
• Design and implement data pipelines (batch and real-time) using GCP services (e.g., BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc); see the illustrative sketch after this list.
• Develop and enforce data modelling standards (conceptual, logical, physical) and design schemas for data warehouses, lakes or lakehouses.
• Establish and maintain data governance, metadata management, lineage, data quality, and security frameworks.
• Lead migration of on-premises or other cloud data platforms to GCP, including legacy system modernisation.
• Optimize system performance, cost, scalability, reliability and security of the data platform.
• Mentor and guide engineering and analytics teams in best practices for cloud data, DataOps/MLOps and infrastructure as code (IaC).
• Contribute to technical and pre-sales engagements: solution proposals, architectural reviews, whiteboard sessions, technical workshops.
• Stay current with GCP innovations and emerging data patterns (data mesh, data fabric, lakehouse), and integrate relevant technologies.
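For context on the pipeline responsibility above, the following is a minimal, illustrative Python sketch of a batch ingestion step that loads CSV files from Cloud Storage into BigQuery. The project ID, bucket path and table names are placeholders, not details of this engagement; a production pipeline would add schema management, error handling and orchestration (e.g., via Cloud Composer).

    from google.cloud import bigquery

    # Placeholder project ID; a real deployment would use the client project's ID.
    client = bigquery.Client(project="example-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,          # skip the header row in each file
        autodetect=True,              # infer the schema from the files
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Hypothetical source bucket and destination table, for illustration only.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/sales/2025-10-29/*.csv",
        "example-project.analytics.raw_sales",
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes

    table = client.get_table("example-project.analytics.raw_sales")
    print(f"Loaded {table.num_rows} rows into {table.full_table_id}")

The same pattern extends to the real-time side of the role by swapping the load job for a Pub/Sub subscription feeding a Dataflow (Apache Beam) streaming job into the same BigQuery tables.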
Required Qualifications
• Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
• Minimum 7-10 years of data engineering/architecture experience, including at least 5 years focused on GCP or equivalent cloud-native data platforms.
• Deep expertise in GCP services: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc, Cloud Composer and, preferably, Vertex AI.
• Strong data modelling skills: dimensional modelling, star/snowflake schemas, conceptual/logical/physical models (see the illustrative schema sketch after this list).
• Proficiency in SQL, Python (or Scala/Java) and data transformation techniques.
• Experience with ETL/ELT frameworks, real-time streaming, batch processing and distributed systems.
• Knowledge of Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager, and of CI/CD pipelines for data solutions.
• Proven ability to define and implement data governance, metadata management and security frameworks.
• Excellent communication skills: able to engage with both technical teams and business stakeholders, present architectural decisions and lead workshops.
• Certifications such as Google Cloud Professional Data Engineer or Google Cloud Professional Cloud Architect are highly desirable.
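To illustrate the dimensional-modelling expectation above, here is a hedged Python/SQL sketch that creates a simple star schema in BigQuery: one customer dimension and one partitioned, clustered sales fact table. The dataset, table and column names are invented for illustration; the actual models would be defined with stakeholders during design.

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project ID

    # Dimension table: one row per customer (illustrative columns only).
    dim_customer_ddl = """
    CREATE TABLE IF NOT EXISTS analytics.dim_customer (
      customer_key  INT64 NOT NULL,
      customer_name STRING,
      region        STRING,
      valid_from    DATE,
      valid_to      DATE
    )
    """

    # Fact table: one row per order line, partitioned by date and
    # clustered on the dimension key to keep queries cheap and fast.
    fact_sales_ddl = """
    CREATE TABLE IF NOT EXISTS analytics.fact_sales (
      order_id     STRING NOT NULL,
      customer_key INT64  NOT NULL,
      order_date   DATE   NOT NULL,
      quantity     INT64,
      net_amount   NUMERIC
    )
    PARTITION BY order_date
    CLUSTER BY customer_key
    """

    for ddl in (dim_customer_ddl, fact_sales_ddl):
        client.query(ddl).result()  # run each DDL statement and wait for it

Analysts then join fact_sales to dim_customer on customer_key, the basic query shape a star schema is designed to keep simple and performant.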






