SThree

Senior Data Engineer (GCP & Analytics) - Hybrid - London

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (GCP & Analytics) in London, offering a hybrid work model. It requires 9+ years of data engineering experience, strong SQL skills, GCP expertise, and familiarity with analytics tools like Power BI.
🌎 - Country
United Kingdom
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Data Pipeline #Data Engineering #DAX #ETL (Extract, Transform, Load) #Azure Cloud #Data Framework #Terraform #GCP (Google Cloud Platform) #Scala #SQL (Structured Query Language) #Azure #Power Pivot #Visualization #API (Application Programming Interface) #BI (Business Intelligence) #Python #Security #Databases #Java #Cloud #DevOps #Microsoft Power BI #RDBMS (Relational Database Management System) #Spark (Apache Spark) #Dataflow #NoSQL #BigQuery
Role description
Responsibilities

We are looking for a Senior Data Engineer to take end-to-end ownership of modern data products within a cloud-based analytics platform. You will design, build, optimize, and support scalable data pipelines while championing DevOps practices, technical quality, and continuous improvement.

Key Responsibilities:
• Own the full lifecycle of data products (build, optimize, support)
• Design efficient data models and pipelines for analytics use cases
• Drive DevOps best practices (CI/CD, Terraform, cloud infrastructure)
• Reduce technical debt and improve engineering standards
• Develop data-centric and API-based solutions using Python, Java, or Scala
• Collaborate closely with analytics, BI, and business stakeholders
• Support reporting and visualization use cases

Required Experience
• 9+ years of experience in data engineering
• Strong data modelling and SQL expertise
• Experience with ETL/ELT, RDBMS, and NoSQL databases
• Hands-on experience with GCP: BigQuery, Dataflow, Dataproc, Cloud Functions
• Experience with data frameworks: Spark, Beam, Hive, or Flink
• Familiarity with data formats such as Parquet and Avro

Analytics & BI
• Experience with Power BI or similar tools
• Ability to build intermediate DAX measures
• Working knowledge of Power Query, Power Pivot, Excel analytics

Nice To Have
• GCP or Azure Data Engineering certification
• Azure cloud experience

Interested? Share your updated CV via L.barisic(at)globalenterprisepartners.com or call +31205305801
Important: job fraud

Unfortunately, job fraud is becoming more common. Beware of such scams:
• We will never ask for personal information (such as a copy of your ID, bank details, or social security number) via WhatsApp or during a video call.
• If you're unsure whether a vacancy or contact person is legitimate, please reach out to us directly using the official contact details on our website.