Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position on a 16-week contract in Reading, paying £375 per day. Key skills include Google Cloud Platform, SQL, ETL processes, and Tableau. Immediate start with a hybrid working model.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
375
🗓️ - Date discovered
September 3, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
Reading, England, United Kingdom
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Microsoft Power BI #Version Control #ETL (Extract, Transform, Load) #Agile #Scala #PostgreSQL #Git #BigQuery #Datasets #SQL (Structured Query Language) #Data Engineering #Data Pipeline #BI (Business Intelligence) #dbt (data build tool) #Scrum #Tableau #Migration #Cloud
Role description
Join a leading telecommunications company as a Data Engineer! Be part of a major transformation initiative, helping to modernise tech stacks and reporting infrastructure. This is a fantastic opportunity to work with cutting-edge cloud technologies, collaborate within an agile scrum team, and make a real impact on how data is sourced, transformed, and visualised across the business.

Job Overview
We are looking for a highly skilled and proactive Data Engineer to support a large-scale transformation project. The role involves migrating legacy systems to modern cloud-based platforms, integrating data from multiple sources, and enabling advanced reporting capabilities. You'll work within a dedicated scrum team, supporting the transition from Power BI to Tableau and helping to build scalable, high-performance data pipelines on Google Cloud Platform.

Contract Details
• Location: Reading (hybrid – on-site twice per month)
• Rate: £375 per day through an umbrella company
• Contract length: 16 weeks (with scope to extend)

Key Responsibilities
• Design, build, and maintain scalable data pipelines using GCP-native tools (see the sketch below).
• Extract, clean, and transform data from at least 10 disparate sources.
• Support the migration of reporting tools from Power BI to Tableau.
• Collaborate with analysts, developers, and stakeholders to deliver robust data solutions.
• Ensure data queries and pipelines are performant and well optimised.
• Work within an agile framework of three-week sprints, contributing to continuous delivery.

Key Requirements
• Proven experience with Google Cloud Platform, especially BigQuery and Dataform/dbt.
• Strong SQL skills (PostgreSQL preferred) and experience with large datasets.
• Background in ETL/ELT processes and data modelling.
• Experience integrating data for BI tools, particularly Tableau.
• Familiarity with Git and version control practices.
• Ability to communicate technical concepts to non-technical stakeholders.

Start Date: ASAP
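To give candidates a feel for the day-to-day work, here is a minimal sketch of the kind of GCP-native pipeline step the responsibilities describe: loading a cleaned extract into BigQuery and running a transformation for a reporting tool such as Tableau, using the standard google-cloud-bigquery Python client. The project, bucket, dataset, and table names are hypothetical placeholders, not details of this engagement.

```python
# Minimal sketch of a BigQuery load-and-transform step.
# All project/bucket/dataset/table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

# Load a cleaned CSV extract from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/extracts/orders.csv",  # hypothetical source file
    "example-project.staging.orders",           # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the header row
        autodetect=True,      # infer the schema; fine for a sketch
    ),
)
load_job.result()  # block until the load completes

# Transform the staged data into a reporting table for Tableau to read.
transform_sql = """
CREATE OR REPLACE TABLE `example-project.reporting.daily_orders` AS
SELECT order_date, COUNT(*) AS orders, SUM(order_value) AS revenue
FROM `example-project.staging.orders`
GROUP BY order_date
"""
client.query(transform_sql).result()
```

In practice a transformation like the final query would more likely live in a Dataform or dbt model under version control, per the requirements above; the inline SQL here just keeps the sketch self-contained.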