

Retelligence
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Contract) in London (Hybrid) for 3 months at £450-550 per day. Key skills include expert Python, advanced SQL, and extensive GCP experience. A proven track record in data engineering projects is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
550
-
🗓️ - Date
March 26, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Documentation #Leadership #Stories #GCP (Google Cloud Platform) #BigQuery #Version Control #SQL (Structured Query Language) #SQL Queries #Data Orchestration #Automated Testing #Data Engineering #Scala #Cloud #Datasets #GIT #Dataflow #Storage #Data Architecture #ETL (Extract, Transform, Load) #Data Processing #Airflow #Python
Role description
Senior GCP Data Engineer (Contract)
Location: London (Hybrid)
Rate: £450-550 per day
IR35 Status: Outside IR35
Duration: 3 Months (Initial + likely extension)
The Opportunity
We are looking for a high-calibre Senior GCP Data Engineer to join one of London's fastest-growing technology success stories. Following a period of exceptional performance and record-breaking growth, the company is scaling its data infrastructure to support global operations.
The Role
As a Senior Data Engineer, you will be a key architect and builder of our modern data platform. You will be responsible for:
• Pipeline Engineering: Designing and implementing robust, scalable ETL/ELT pipelines to ingest data from a variety of internal and external sources.
• Infrastructure: Leveraging the full power of Google Cloud Platform to ensure high availability and performance of our data environment.
• Optimization: Refining and optimizing complex SQL queries and Python scripts to ensure efficient data processing and cost management.
• Architecture: Collaborating with stakeholders to define data models and ensure the data architecture supports the company's rapid scaling.
• Best Practices: Championing engineering excellence through CI/CD, automated testing, and comprehensive documentation.
Your Tech Stack
• Language: Expert-level Python and advanced SQL.
• Cloud: Extensive experience with GCP (BigQuery, Cloud Storage, Dataflow/PubSub, Cloud Composer/Airflow).
• Tools: Experience with data orchestration tools and version control (Git).
• Environment: Proficiency in building production-grade pipelines.
What We’re Looking For
• A proven track record of delivering end-to-end data engineering projects on GCP.
• Senior-level experience in managing complex datasets and distributed systems.
• A "delivery-first" mindset with the ability to work independently in a fast-paced, high-growth environment.
• Excellent communication skills to translate technical concepts for non-technical leadership.