Glocomms

GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with a contract length of "X months" and a pay rate of "$Y per hour." Key skills include Google BigQuery, DataOps, Terraform, and Python. Experience with data models, pipelines, and GCP is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
April 17, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
City Of London, England, United Kingdom
🧠 - Skills detailed
#Storage #Infrastructure as Code (IaC) #Compliance #Bash #Shell Scripting #BigQuery #Data Engineering #Cloud #Scala #GCP (Google Cloud Platform) #Data Science #Data Pipeline #Data Warehouse #SQL (Structured Query Language) #dbt (data build tool) #Documentation #Data Lake #Data Quality #Python #Data Integration #Terraform #Scripting #DataOps
Role description
What you will be doing
• Work with business stakeholders (initially the Finance and Actuarial teams), data scientists, and engineers to design, build, optimise, and maintain production-grade data pipelines and reporting based on a modern cloud data warehouse solution.
• Collaborate closely with the Finance, Actuarial, Data Science, and Engineering teams to identify and maximise the value of new internal and external data sources.
• Partner with third-party delivery vendors to ensure the robust design and engineering of data models, management information (MI), and reporting, supporting organisational growth and scalability.
• Take BAU ownership of data models, reporting assets, and data integrations/pipelines.
• Create frameworks, infrastructure, and systems to manage and govern enterprise data assets effectively.
• Produce detailed technical and functional documentation to enable ongoing BAU support and maintenance of data structures, schemas, pipelines, and reporting.
• Work with the wider Engineering community to develop and enhance data platform and MLOps capabilities.
• Ensure high data quality, governance, and compliance with internal policies and external regulatory standards.
• Proactively monitor, troubleshoot, and resolve data pipeline issues, ensuring the reliability, accuracy, and performance of data products.
Required skills
• Experience designing data models and developing industrialised data pipelines
• Strong knowledge of database and data lake systems
• Experience creating and maintaining DataOps pipelines
• Hands-on experience with Google BigQuery, dbt, and GCP Cloud Storage
• Knowledge of Cloud SQL and data integration tools such as Airbyte
• Experience provisioning new infrastructure in a leading cloud provider, preferably Google Cloud Platform (GCP)
• Proficient in Terraform for infrastructure as code
• Experience working with Dagster for data pipeline orchestration (a minimal sketch follows this list)
• Comfortable with shell scripting using Bash or similar tools
• Proficient in Python and SQL
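For candidates gauging the Dagster, BigQuery, Python, and SQL expectations above, here is a minimal illustrative sketch, not taken from the posting, of a Dagster asset that materialises a BigQuery reporting table. The project, dataset, table, and column names are hypothetical placeholders, and the code assumes default GCP application credentials are available in the environment.

```python
# Hedged sketch: a Dagster asset that rebuilds a BigQuery reporting table.
# All project/dataset/table/column names below are hypothetical examples.
from dagster import asset, materialize
from google.cloud import bigquery

# Hypothetical fully qualified table names, for illustration only.
SOURCE_TABLE = "my-project.finance.policy_transactions"
TARGET_TABLE = "my-project.reporting.monthly_premiums"


@asset
def monthly_premiums() -> None:
    """Aggregate raw transactions into a monthly reporting table."""
    client = bigquery.Client()  # picks up default GCP credentials
    sql = f"""
        CREATE OR REPLACE TABLE `{TARGET_TABLE}` AS
        SELECT DATE_TRUNC(transaction_date, MONTH) AS month,
               SUM(premium_amount) AS total_premium
        FROM `{SOURCE_TABLE}`
        GROUP BY month
    """
    client.query(sql).result()  # block until the query job completes


if __name__ == "__main__":
    # Ad-hoc materialisation; in production a Dagster schedule or
    # sensor would trigger this, and dbt would typically own the SQL.
    materialize([monthly_premiums])
```

Running the file once materialises the table directly; in a real deployment the asset would live in a Dagster code location alongside dbt-managed models rather than being invoked by hand.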