

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with expertise in GCP, offering £500-£550 per day on a 6-month contract. Key skills include ETL, Terraform, Ansible, and proficiency in Python, Go, and Bash. Hybrid work requires 2 days onsite in Osterley.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
550
🗓️ - Date discovered
July 17, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#BigQuery #Dataflow #Scripting #Ansible #GitLab #Data Pipeline #Infrastructure as Code (IaC) #Cloud #GCP (Google Cloud Platform) #Python #Data Engineering #ETL (Extract, Transform, Load) #Scala #Jenkins #Bash #Shell Scripting #Terraform #Linux
Role description
About
Data Engineer - GCP | £500-£550 per day | Inside IR35 | 6-Month Contract | Hybrid (2 Days Onsite - Osterley)

83zero are partnered with a leading media and broadcasting organisation on the lookout for a skilled Data Engineer to join their Data & Analytics team on an initial 6-month contract. This role is perfect for someone with strong ETL expertise, deep experience in Google Cloud Platform (GCP), and a passion for building scalable, cloud-native data pipelines. You'll work with cutting-edge tech in a fast-paced environment, helping to deliver critical insights and analytics to the business.

What You'll Be Doing:
• Designing and developing scalable ETL pipelines to process and deliver large volumes of data.
• Working hands-on with GCP services including BigQuery, Pub/Sub, and Dataflow.
• Automating infrastructure using Terraform, Ansible, and CI/CD tooling.
• Writing clean, efficient code in Python, Go, and Bash.
• Supporting and maintaining a secure Linux-based data engineering environment.
• Collaborating with stakeholders to ensure data pipelines meet business needs and SLAs.

What We're Looking For:
• Proven experience in data engineering with a strong focus on cloud-based ETL workflows.
• Solid background with Google Cloud Platform (GCP) and associated data tools.
• Skilled in Infrastructure as Code - Terraform and Ansible preferred.
• Confident working with CI/CD pipelines (Jenkins, GitLab CI, GoCD, etc.).
• Proficient in Python, Go, and shell scripting (Bash).
• Strong Linux system administration skills.
• Ability to work 2 days per week onsite in Osterley.
Nice-to-have skills
• ETL
• Google Cloud Platform
• Terraform
• Ansible
• Python
• Go
• Bash
• Linux
Work experience
• Data Engineer
• Data Infrastructure
Languages
• English