Stott and May

Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Architect, offering a 6-month contract at £552 per day, hybrid in Bristol. Key skills include GCP data architecture, data governance, and SRE principles. Experience in financial services and cloud migration is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
552
-
🗓️ - Date
February 17, 2026
🕒 - Duration
6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Greater Bristol Area, United Kingdom
-
🧠 - Skills detailed
#Data Management #Scala #Observability #Agile #Unit Testing #Data Governance #Looker #Data Pipeline #Data Architecture #Data Engineering #Strategy #Migration #BigQuery #Metadata #Security #Dynatrace #Deployment #Data Lineage #ETL (Extract, Transform, Load) #Data Strategy #Data Quality #Documentation #Compliance #GCP (Google Cloud Platform) #IAM (Identity and Access Management) #Dataflow #Monitoring #Physical Data Model #Cloud
Role description
Job Title: GCP Data Architect
Location: Bristol (Hybrid – 2 days per week in the office)
Day Rate: £552 per day (Inside IR35)
Contract Duration: 6 months

The Role
We are seeking an experienced GCP Data Architect to play a critical role in architecting and modernising cloud-native data platforms on Google Cloud Platform within a major financial services environment. The successful candidate will design scalable, secure, resilient and cost-optimised data solutions supporting high-volume, highly regulated workloads. This role offers exposure to large-scale transformation programmes, modern engineering practices, mature data governance frameworks and advanced GCP services, with the opportunity to influence enterprise-wide data strategy.

Key Responsibilities
• Architect end-to-end data solutions using GCP services including BigQuery, Dataflow, Pub/Sub, Dataproc, GCS and Composer.
• Design conceptual, logical and physical data models across complex risk, operations, analytics and regulatory domains.
• Build scalable ingestion and transformation frameworks with a strong emphasis on data quality, lineage, metadata and auditability.
• Identify appropriate cloud architecture patterns for workload deployment and ensure governance standards are upheld.
• Define and enforce security best practices, IAM policies and data protection standards.
• Develop and deploy data pipelines using multiple GCP services.
• Manage data lineage and data quality aspects of data products.
• Define and implement SRE principles including SLIs, SLOs and SLAs for workloads.
• Lead cost optimisation initiatives and support FinOps optimisation for live workloads.
• Review and create FinOps dashboards to monitor cloud spend and efficiency.
• Provide architectural governance, reusable frameworks and technical oversight to engineering teams.
• Support Agile feature teams across the lab environment to optimise performance and reliability.
• Drive cloud modernisation and legacy-to-cloud migration initiatives.
• Conduct proofs of concept and evaluate emerging tools to enhance cloud data capabilities.
• Ensure compliance with regulatory, audit and enterprise data governance requirements.
• Create custom monitoring and alerting solutions using Dynatrace.
• Integrate Looker and enable reporting solutions for end users within defined guardrails.
• Implement TDD for unit testing and BDD for functional testing.

Essential Skills & Experience
• Strong experience architecting high-volume ingestion, transformation and analytics pipelines on GCP.
• Deep knowledge of data governance, lineage, metadata management and regulatory controls.
• Proven experience identifying appropriate architecture patterns and enforcing governance standards.
• Experience managing data lineage and quality across enterprise data platforms.
• Strong understanding of SRE principles and workload reliability engineering.
• Experience supporting FinOps optimisation and cloud cost management initiatives.
• Hands-on experience developing and deploying data pipelines using GCP services.
• Experience implementing TDD and BDD testing approaches.
• Strong stakeholder engagement and the ability to influence technical direction within cross-functional Agile teams.
• Experience delivering cloud migration or platform modernisation programmes.
• Excellent documentation and communication skills.

Desirable Skills & Experience
• GCP Professional Data Engineer or Cloud Architect certification.
• Experience within BFSI domains such as AML, Fraud, Risk, Finance or Regulatory Reporting.
• Exposure to real-time streaming, MLOps and advanced analytics architectures.
• Experience with observability tooling and cloud cost management frameworks.