

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer, remote and UK-based, with a contract length of 6 months and a pay rate of £400 - £530 per day outside IR35. Requires 3+ years of data engineering experience, strong Python and SQL skills, and familiarity with Airflow and cloud environments.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
530
-
🗓️ - Date discovered
August 29, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Outside IR35
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
London, England, United Kingdom
-
🧠 - Skills detailed
#Automation #Puppeteer #ETL (Extract, Transform, Load) #Observability #Airflow #Batch #AWS (Amazon Web Services) #Cloud #Snowflake #Terraform #Data Quality #Data Engineering #Python #SQL (Structured Query Language) #Web Scraping #Kafka (Apache Kafka)
Role description
Data Engineer – HIRING ASAP – Outside IR35
Start date: ASAP
Duration: 6 months initially with a view to extend
Location: Remote (Need to be UK Based)
Rate: £400 - £530 per day Outside IR35
Responsibilities
• Design, build, and maintain ETL/ELT pipelines and batch/streaming workflows.
• Integrate data from external APIs and internal systems into Snowflake and downstream tools.
• Use web scraping / browser automation to pull data from platforms that offer only UI-based data extraction (no APIs).
• Own critical parts of our Airflow-based orchestration layer and Kafka-based event streams.
• Ensure data quality, reliability, and observability across our pipelines and platforms.
• Build shared data tools and frameworks to support analytics and reporting use cases.
• Partner closely with analysts, product managers, and other engineers to support data-driven decisions.
Key Skills
• 3+ years of experience as a Data Engineer working on data infrastructure.
• Strong Python skills and hands-on experience with SQL.
• Experience with modern orchestration tools like Airflow.
• Experience with APIs and extracting data from APIs.
• Understanding of data modelling, governance, and performance tuning in warehouse environments.
• Comfort operating in a cloud-native environment like AWS.
• Terraform experience.
Nice to have
• Snowflake
• Web scraping via browser automation (e.g. Playwright, Selenium, Puppeteer)
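As a rough illustration of the "data quality" responsibility above, a minimal row-level validation step might look like the sketch below. This is pure Python with no specific warehouse assumed; the field names and rules are hypothetical, not part of this role's actual stack:

```python
from datetime import datetime

# Hypothetical validation rules for a batch of scraped/API rows
# before loading them into the warehouse.
REQUIRED_FIELDS = {"id", "price", "captured_at"}

def validate_row(row: dict) -> list[str]:
    """Return a list of data-quality errors for one record (empty = clean)."""
    errors = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "price" in row and (not isinstance(row["price"], (int, float)) or row["price"] < 0):
        errors.append("price must be a non-negative number")
    if "captured_at" in row:
        try:
            datetime.fromisoformat(row["captured_at"])
        except (TypeError, ValueError):
            errors.append("captured_at is not an ISO-8601 timestamp")
    return errors

def split_batch(rows: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Partition a batch into loadable rows and rejected rows with reasons."""
    good, bad = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            bad.append((row, errs))
        else:
            good.append(row)
    return good, bad
```

In practice, checks like these would typically run inside an Airflow task (or via a dedicated framework such as Great Expectations), with rejected rows routed to a quarantine table and failure rates surfaced through observability tooling.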