Bodhi

GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer on a 12-month contract, paying £480–£510 p/day, based in the UK with remote work and occasional office visits. Key skills include BigQuery, SQL, ETL/ELT pipelines, and GCP services. Google Cloud certification required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
504
-
🗓️ - Date
February 20, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Surrey, England, United Kingdom
-
🧠 - Skills detailed
#Programming #Azure #ADF (Azure Data Factory) #Airflow #IAM (Identity and Access Management) #Data Analysis #Data Quality #Microsoft Azure #Redshift #Amazon Redshift #Data Architecture #Apache Beam #ETL (Extract, Transform, Load) #Security #Data Processing #Azure Data Factory #Scala #Scripting #Databases #Automation #Java #Shell Scripting #BigQuery #BI (Business Intelligence) #GCP (Google Cloud Platform) #Storage #Data Pipeline #Data Integration #Dataflow #SQL (Structured Query Language) #Monitoring #Python #Data Engineering #Clustering #Data Governance #Data Warehouse #Cloud #Compliance
Role description
JOB TITLE: GCP Engineer
LOCATION: UK
REMOTE: With occasional office visits to Surrey office
PAY RATE: £480–£510 p/day Inside IR35
DURATION: 12-month contract

SUBSIDIARY / DEPARTMENT OVERVIEW:
The organisation is a globally recognised leader in technology and innovation, delivering advanced digital products and solutions used by millions of people worldwide. With a strong focus on cutting-edge technologies and continuous improvement, the company drives digital transformation across multiple markets. The global software solutions and IT services division plays a key role in delivering enterprise-scale digital capabilities. This position sits within a newly established CDM Operations Team, supporting marketing activities across more than 20 European countries.

PURPOSE OF THE JOB:
The organisation is seeking a skilled Google Cloud Data Engineer to design, implement, and optimise data solutions within the Google Cloud Platform (GCP) ecosystem. The successful candidate will collaborate with cross-functional teams to ensure effective data integration, governance, and analytics capabilities.

KEY ACCOUNTABILITIES
1. Data Architecture & Solution Design
• Design scalable, efficient data solutions using BigQuery and other GCP tools to support business intelligence and analytics requirements.
• Work closely with stakeholders to gather data requirements and translate them into technical designs.
2. Data Integration & Pipelines
• Build, maintain, and optimise ETL/ELT pipelines using tools such as Dataflow, Apache Beam, and Cloud Composer.
• Integrate multiple data sources, including APIs, relational databases, and streaming platforms, into BigQuery.
3. BigQuery Optimisation & Performance Tuning
• Optimise BigQuery queries and storage structures to ensure high performance and cost efficiency.
• Implement partitioning and clustering strategies to enhance query performance.
4. Cloud Infrastructure Management
• Configure and manage GCP services such as Cloud Storage, Pub/Sub, and IAM to ensure secure and reliable data operations.
• Apply best practices in cloud security and compliance.
5. Data Governance & Quality
• Implement data quality and governance frameworks to ensure accuracy, consistency, and availability of data.
• Establish monitoring and alerting mechanisms for pipelines and systems to proactively prevent and resolve issues.
6. Collaboration & Support
• Partner with data analysts, engineers, and business stakeholders to enable efficient data processing.
• Provide technical guidance and support to team members.

KEY LIAISONS
• Data Engineering Team
• Adobe Team
• European Regional Office
• Headquarters

DIMENSIONS
• Maintain strong working relationships with all key stakeholders.
• Support and align activities with both marketing and operations teams.

SKILLS AND EXPERIENCE
Essential
1. Language Skills
• Exceptional English communication skills, as the role involves collaboration with global teams.
2. Technical Expertise
• Strong proficiency in BigQuery and SQL, including data modelling and query optimisation.
• Hands-on experience with GCP services such as Cloud Storage, Cloud Composer, Dataflow, and Pub/Sub.
• Familiarity with data pipeline frameworks such as Apache Beam and Airflow.
• Strong programming skills in Python or Java for data processing and scripting.
• Knowledge of shell scripting and cloud automation.
• Proven experience designing and managing cloud-based data solutions.
• Strong background in developing and maintaining ETL/ELT pipelines.
• Demonstrated ability to optimise BigQuery performance and manage cloud costs effectively.
• Experience implementing partitioning, clustering, and materialised views.
3. Soft Skills
• Excellent analytical and problem-solving abilities.
• Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders.
• Ability to work collaboratively in a fast-paced, evolving environment.
4. Certifications
• Google Cloud Certified: Professional Data Engineer or Associate Cloud Engineer.

Desired
• Experience with Amazon Redshift for managing and optimising data warehouse solutions across multi-cloud environments.
• Experience with Microsoft Azure tools, particularly Azure Data Factory (ADF).

CHALLENGE:
The organisation operates within a fast-paced and evolving environment where processes and procedures frequently change. The successful candidate must stay up to date with technological developments and assess their potential business impact.

Note: This job description outlines the primary responsibilities of the role but does not represent an exhaustive list of duties. It is intended to clarify expectations between the Manager and the employee and may be amended in line with evolving business requirements.