AddanEx International

GCP Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Holborn, London, on a 6-month contract with a pay rate of "TBD." Requires 9+ years of experience, expertise in GCP tools, data modeling, SQL, and programming languages like Python or Java.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
๐Ÿ—“๏ธ - Date
April 22, 2026
🕒 - Duration
More than 6 months
-
๐Ÿ๏ธ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Visualization #Dataflow #"ETL (Extract, Transform, Load)" #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Scala #Cloud #Data Pipeline #Azure #Data Engineering #Databases #Terraform #Power Pivot #Microsoft Excel #Programming #DevOps #Java #NoSQL #Python #BigQuery #Automation #Data Processing #RDBMS (Relational Database Management System) #Spark (Apache Spark) #API (Application Programming Interface) #DAX
Role description
AddanEx JOB POSTING
Title: GCP Data Engineer
Location: Holborn, London (3x per week onsite)
Type: Contract
Start Date: ASAP / April–May
Duration: 6 months + extensions
Utilisation: 40 hours per week
Languages: English
Requirements:
• 9+ years' work experience
• Take end-to-end responsibility for building, optimising, and supporting existing and new data products towards the defined target vision
• Champion a DevOps mindset and principles; able to manage CI/CD pipelines, Terraform, and cloud infrastructure (in our context, GCP — Google Cloud Platform)
• Evaluate and drive continuous improvement and reduce technical debt in the teams
• Design and implement efficient data models and data pipelines that support analytical requirements; good understanding of different data modelling techniques and their trade-offs
• Experience with data modelling
• Experience with data query languages (SQL or similar); knowledge of ETL processes and tools
• Experience in data-centric and API programming (for automation) using one or more of Python, Java, or Scala
• Knowledge of NoSQL and RDBMS databases
• Experience with different data formats (Avro, Parquet)
• Collaborative and co-creative mindset with excellent communication skills
• Motivated to work in an environment that allows you to work and take decisions independently
• Experience working with data visualization tools
• Experience with GCP tools: Cloud Functions, Dataflow, Dataproc, and BigQuery
• Experience with data processing frameworks: Beam, Spark, Hive, Flink
• GCP (and/or Azure) data engineering certification is a merit
• Hands-on experience with analytical tools such as Power BI or similar visualization tools
• Understanding of how to create intermediate-level DAX measures to enhance data models and visualizations
• Understanding of Microsoft Excel features such as Power Pivot and Power Query, plus Tabular Editor, DAX, etc.
• Fluent in English, both written and verbal