YunoJuno

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-month freelance contract, fully remote, with a pay rate of £250 per day. Key skills required include GCP, BigQuery, Apache Airflow, and Python. 5+ years of data engineering experience is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
250
🗓️ - Date
May 12, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Data Pipeline #Data Quality #Schema Design #JavaScript #BigQuery #Data Modeling #Airflow #ETL (Extract, Transform, Load) #Documentation #Cloud #Data Warehouse #GCP (Google Cloud Platform) #Python #React #Apache Airflow #Data Engineering
Role description
Data Engineer / 6-month freelance contract / Fully remote (any location)

YunoJuno has partnered with a media company that is looking to hire an experienced freelance Data Engineer for an upcoming 6-month contract. You'll work directly on enhancing and maintaining the cloud-based data warehouse, helping to scale the data infrastructure, improve pipeline reliability, and expand data collection capabilities.

Responsibilities
• Enhance and maintain the existing GCP-based data warehouse, including schema design, performance tuning, and cost optimization
• Build and manage data pipelines using Apache Airflow and Python
• Integrate new data sources, including social media APIs and web crawling pipelines
• Collaborate with engineering and product teams to support data needs across the organization
• Ensure data quality, reliability, and documentation across all pipelines
• Support event-level data collection and tracking infrastructure

Requirements
• 5+ years of data engineering experience
• Strong hands-on experience with Google Cloud Platform (GCP), specifically BigQuery
• Proficiency with Apache Airflow for pipeline orchestration
• Strong Python development skills for ETL/ELT pipeline development
• Experience with data modeling, warehousing best practices, and query optimization

Nice to have
• Experience with web crawling and scraping at scale
• Experience integrating social media APIs (e.g., Twitter/X, LinkedIn, Meta) for data pipeline creation
• Familiarity with event-level data collection platforms such as RudderStack or Segment
• JavaScript, Node.js/Express, and React development experience

Start date: ASAP
Duration: 6-month freelance contract
Rate: £250 per day
Location: Fully remote
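For candidates wondering what the day-to-day pipeline work looks like, the Airflow/Python responsibilities above usually boil down to extract-transform-load steps with idempotent writes. A minimal, hypothetical sketch (invented event data; SQLite stands in for BigQuery so the example runs with only the Python standard library — in the role itself this logic would sit inside an Airflow task and use the BigQuery client):

```python
import sqlite3

# Stand-in for a social-media API response; "e1" is delivered twice,
# as real event streams often redeliver records.
RAW_EVENTS = [
    {"id": "e1", "user": "alice", "type": "click"},
    {"id": "e2", "user": "bob", "type": "view"},
    {"id": "e1", "user": "alice", "type": "click"},  # duplicate delivery
]

def transform(events):
    """Deduplicate by event id and keep only the fields the warehouse needs."""
    seen = {}
    for e in events:
        seen.setdefault(e["id"], {"id": e["id"], "user": e["user"], "event_type": e["type"]})
    return list(seen.values())

def load(conn, rows):
    """Idempotent load: INSERT OR REPLACE keyed on the event id, so reruns are safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (id TEXT PRIMARY KEY, user TEXT, event_type TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO events VALUES (:id, :user, :event_type)", rows
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(conn, transform(RAW_EVENTS))
    print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # duplicates collapsed
```

The deduplicate-then-upsert pattern is what makes a pipeline like this safe to rerun after an Airflow task retry, which is the kind of reliability work the responsibilities above describe.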