Gravitas Recruitment Group

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 3–6 month contract, paying £350/€400 per day, remote in the UK or EU. Key skills include GCP, BigQuery, dbt, Airflow, Docker, and Kubernetes; experience in data governance is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
350
-
🗓️ - Date
March 19, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
England, United Kingdom
-
🧠 - Skills detailed
#Apache Airflow #Data Quality #Security #Scala #Consulting #Monitoring #Cloud #Datasets #Apache Beam #Data Governance #Airflow #Batch #ETL (Extract, Transform, Load) #Docker #Documentation #Data Processing #Clustering #Deployment #dbt (data build tool) #Data Engineering #SQL (Structured Query Language) #BigQuery #Dataflow #GCP (Google Cloud Platform) #Kubernetes #Macros
Role description
Location: UK or EU, Remote
Rate: £350 per day / €400 per day
Contract: Outside IR35, 3–6 month initial contract

Overview
We are seeking a Data Engineer to join a consulting engagement, supporting the design, build, and optimisation of robust data platforms and pipelines. You will work closely with analysts, engineers, and stakeholders to deliver reliable, well-governed datasets and enable data-driven decision-making across client environments.

Key Responsibilities
• Design, develop, and maintain scalable ELT/ETL pipelines on Google Cloud Platform.
• Model, transform, and test data using dbt; implement best practices for modular, maintainable projects.
• Build and optimise data solutions in BigQuery, including partitioning, clustering, performance tuning, and cost control.
• Orchestrate workflows using Apache Airflow, including scheduling, monitoring, alerting, and retries.
• Develop batch/stream processing solutions using Google Dataflow where appropriate.
• Containerise services with Docker and support deployments in Kubernetes environments.
• Implement data quality checks, lineage, documentation, and CI/CD practices.
• Collaborate with stakeholders to gather requirements and translate them into technical deliverables.

Essential Requirements
• Proven experience as a Data Engineer delivering production-grade pipelines and data models.
• Strong hands-on experience with GCP, including BigQuery.
• Solid dbt experience (models, tests, macros, documentation) and SQL expertise.
• Commercial experience with Airflow for orchestration and operational support.
• Working knowledge of Google Dataflow (Apache Beam) for data processing.
• Experience with Docker and Kubernetes in modern data/platform environments.
• Good understanding of data governance, security, and access controls.
• Strong communication skills and the ability to work effectively in a consulting setting.

Desirable
• One year of experience working with mobile data would be beneficial.
• Experience integrating multiple data sources (APIs, event data, relational and semi-structured formats).
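To give a flavour of the data-quality checks listed under Key Responsibilities, here is a minimal sketch in plain Python. The function names, the sample batch, and the checked columns are illustrative only; in a real engagement such checks would more typically live as dbt tests or Airflow tasks guarding a load into BigQuery.

```python
from collections import Counter


def check_not_null(rows, column):
    """Return the rows where the given column is missing (ideally an empty list)."""
    return [row for row in rows if row.get(column) is None]


def check_unique(rows, column):
    """Return the values that appear more than once in the given column."""
    counts = Counter(row.get(column) for row in rows)
    return [value for value, n in counts.items() if n > 1]


# Illustrative batch validated before loading downstream.
batch = [
    {"id": 1, "country": "UK"},
    {"id": 2, "country": None},
    {"id": 2, "country": "DE"},
]

null_failures = check_not_null(batch, "country")  # one row has no country
duplicate_ids = check_unique(batch, "id")         # id 2 appears twice
```

In a dbt project the same intent is usually expressed declaratively (`not_null` and `unique` tests in a model's YAML file) rather than in imperative code, with Airflow handling the scheduling and alerting around it.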