

Queen Square Recruitment
GCP Data Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Architect on a 6-month contract, based in Bristol (hybrid). Key skills include cloud-native data architecture, data governance, and experience in BFSI domains. GCP Professional certification is desirable. The role is inside IR35.
Country: United Kingdom
Currency: £ GBP
Day rate: Unknown
Date: February 17, 2026
Duration: More than 6 months
Location: Hybrid
Contract: Inside IR35
Security: Unknown
Location detailed: City Of Bristol, England, United Kingdom
Skills detailed: #ML (Machine Learning) #Data Management #Scala #Observability #Agile #Unit Testing #Looker #Data Governance #Data Pipeline #Data Architecture #Data Engineering #Migration #Metadata #Dynatrace #Deployment #Data Lineage #ETL (Extract, Transform, Load) #Documentation #ML Ops (Machine Learning Operations) #GCP (Google Cloud Platform) #Monitoring #Cloud
Role description
Exciting New Opportunity: GCP Data Architect
Location: Bristol (Hybrid, 2 days per week in the office)
Contract Duration: 6 Months
IR35 Status: Inside IR35
The Role
You will play a critical part in architecting and modernising cloud-native data platforms on Google Cloud Platform for a major financial services environment, designing scalable, secure, resilient, and cost-optimised data solutions that serve high-volume, heavily regulated workloads. The role offers exposure to large-scale transformation programmes, modern engineering practices, mature data governance frameworks, and advanced GCP services, with the opportunity to influence enterprise-wide data strategy.
Essential skills/experience:
• Identifying the correct architecture patterns for workload deployment and ensuring governance
• Managing data lineage and quality aspects of DPs
• Understanding and defining SRE principles such as SLOs, SLIs, and SLAs for each workload
• Supporting feature teams across the lab in optimising FinOps for live workloads
• Developing and deploying data pipelines using various GCP services (a minimal illustrative sketch follows the skills lists below)
• Implementing TDD for unit testing and BDD for functional testing
• Reviewing and creating FinOps dashboards for the lab
• Creating custom Dynatrace monitoring and alerting
• Integrating Looker and enabling reporting for end-users within general guardrails
• Experience in designing high-volume ingestion, transformation, and analytics pipelines.
• Strong knowledge of data governance, lineage, metadata management, and regulatory controls.
• Demonstrated ability to work with cross-functional Agile teams and influence technical direction.
• Experience executing cloud migration or platform modernisation programmes.
• Excellent architectural documentation, communication, and stakeholder engagement skills.
Desirable skills/experience:
• GCP Professional Data Engineer or Cloud Architect certification.
• Experience in BFSI domains such as AML, Fraud, Risk, Finance, or Regulatory Reporting.
• Exposure to real-time streaming, ML Ops, and advanced analytics architectures.
• Experience with platform observability tools and cloud cost management frameworks.
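To make the pipeline and TDD bullets above concrete, here is a minimal sketch of the kind of work the role describes: an Apache Beam batch pipeline intended for GCP Dataflow, with a unit test written against the transform before it is wired into the job. The table names, the masking rule, and the project IDs are hypothetical placeholders chosen for illustration; this is not the client's actual codebase or design.

```python
# Illustrative sketch only: Apache Beam on GCP Dataflow, with hypothetical
# BigQuery table names and a hypothetical PII-masking rule.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def mask_account_number(record):
    """Mask a sensitive field before it leaves the raw zone (hypothetical rule)."""
    masked = dict(record)
    if masked.get("account_number"):
        masked["account_number"] = "****" + str(masked["account_number"])[-4:]
    return masked


def run(argv=None):
    """Batch pipeline: read raw transactions, mask PII, write a curated table."""
    # Runner, project, region and temp_location come from the command line,
    # e.g. --runner=DataflowRunner --project=... --temp_location=gs://...
    options = PipelineOptions(argv)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromBigQuery(
                query="SELECT * FROM `my-project.finance.transactions`",  # hypothetical
                use_standard_sql=True)
            | "MaskPII" >> beam.Map(mask_account_number)
            | "WriteCurated" >> beam.io.WriteToBigQuery(
                "my-project:curated.transactions_masked",  # hypothetical
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


def test_mask_account_number():
    """TDD-style unit test: runnable with pytest, no GCP credentials needed."""
    records = [{"id": 1, "account_number": "12345678"}]
    expected = [{"id": 1, "account_number": "****5678"}]
    with TestPipeline() as p:
        output = p | beam.Create(records) | "Mask" >> beam.Map(mask_account_number)
        assert_that(output, equal_to(expected))


if __name__ == "__main__":
    run()
```

In this workflow the transform is exercised first by the TestPipeline-based test and only then deployed as a Dataflow job, which is the TDD practice the essential-skills list calls out.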






