

Data Engineer. Python, Airflow, Contract - Outside IR35
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract (remote, UK-based) paying £475-£600 per day. It requires 3+ years of experience, expertise in Python, Snowflake, and Airflow, and strong SQL skills.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
600
-
🗓️ - Date discovered
August 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Outside IR35
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Docker #Python #Batch #SQL (Structured Query Language) #Observability #ETL (Extract, Transform, Load) #Data Modeling #Data Quality #Kafka (Apache Kafka) #Cloud #Snowflake #Data Pipeline #API (Application Programming Interface) #Terraform #Airflow #AWS (Amazon Web Services) #Automation #Web Scraping #Data Engineering #Puppet
Role description
Data Engineer - Python, Snowflake, Airflow (Contract)
The Job
As a Data Engineer, you'll be at the heart of building and optimizing our client's data infrastructure, working on a hugely complex and ambitious project that is the client's number-one priority for 2026.
You'll be responsible for:
• Designing, building, and maintaining robust ETL/ELT pipelines and batch/streaming workflows.
• Integrating data from various sources, including external APIs and internal systems, into Snowflake and other downstream tools.
• Utilizing web scraping and browser automation techniques to extract data from platforms lacking traditional API access.
• Taking ownership of critical components within the Airflow-based orchestration layer and Kafka-based event streams.
• Ensuring the highest levels of data quality, reliability, and observability across all data pipelines and platforms.
• Developing shared data tools and frameworks to empower analytics and reporting initiatives.
• Collaborating closely with analysts, product managers, and engineers to enable data-driven decision-making throughout the organization.
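To give a flavour of the batch-pipeline work described above, here is a minimal, hypothetical sketch of an extract-transform-load step in Python. All names are invented, and sqlite3 stands in for Snowflake purely so the example is self-contained; a real pipeline would run inside an Airflow task and load via the Snowflake connector.

```python
import json
import sqlite3


def extract(raw_json: str) -> list:
    """Extract: parse records from an upstream payload (stub for an API call)."""
    return json.loads(raw_json)


def transform(records: list) -> list:
    """Transform: drop malformed rows and normalise field values."""
    return [
        (r["id"], r["name"].strip().lower(), float(r["amount"]))
        for r in records
        if "id" in r and "amount" in r
    ]


def load(rows: list, conn: sqlite3.Connection) -> int:
    """Load: upsert into the target table and return its row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]


if __name__ == "__main__":
    payload = '[{"id": 1, "name": " Acme ", "amount": "9.5"}, {"name": "no-id"}]'
    conn = sqlite3.connect(":memory:")
    print(load(transform(extract(payload)), conn))  # prints 1: one valid row loaded
```

The same extract/transform/load split maps naturally onto separate Airflow tasks, which is what makes failures observable and retryable step by step.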
About You
We are looking for three people: one senior engineer and two less senior engineers, all at individual-contributor level.
• 3+ years of experience as a Data Engineer or Software Engineer specializing in data infrastructure.
• Expertise in Python, with proven experience in pulling and managing data from APIs.
• Strong SQL skills and hands-on experience with Snowflake.
• Experience with modern orchestration tools such as Airflow.
• A solid understanding of data modeling, governance, and performance tuning within warehouse environments.
• Proficiency in web scraping via browser automation tools (e.g., Playwright, Selenium, Puppeteer) is a plus.
• Ability to work independently, prioritize effectively, and manage multiple stakeholders.
• Comfortable operating within a cloud-native environment (e.g., AWS, Terraform, Docker).
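The "pulling and managing data from APIs" requirement in practice usually means handling pagination and transient failures. A hedged, stdlib-only sketch of that pattern follows; the page shape (`items`/`has_next`) and the stubbed responses are invented for illustration.

```python
import time


def fetch_all(fetch_page, max_retries: int = 3) -> list:
    """Walk a paginated API: call fetch_page(page) until it reports no next
    page, retrying each page a few times on transient connection errors."""
    items, page = [], 1
    while True:
        for attempt in range(max_retries):
            try:
                resp = fetch_page(page)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # give up after the last retry
                time.sleep(0.01 * 2 ** attempt)  # tiny exponential backoff
        items.extend(resp["items"])
        if not resp.get("has_next"):
            return items
        page += 1


# Usage with a stubbed two-page API response:
pages = {1: {"items": [{"id": 1}], "has_next": True},
         2: {"items": [{"id": 2}], "has_next": False}}
print(fetch_all(lambda p: pages[p]))  # [{'id': 1}, {'id': 2}]
```

Injecting `fetch_page` as a callable keeps the walking/retry logic testable without network access; in production it would wrap an HTTP client call.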
The Company
Our client is a dynamic, innovative organization in the affiliate marketing space, working with a huge variety of clients across multiple industries to create content and introductions.
Why You Should Do This Job
• Be part of a talented team and contribute to impactful projects.
• Work with cutting-edge technologies in a fast-paced environment.
• Opportunity to make a significant impact on the organization's data capabilities.
• Gain valuable experience in a modern data stack.
• Possibility of extension based on project needs.
Location
Remote (UK Based)
Duration
6 months, with possible extension
The Money
£475-£600+ per day (dependent on experience; some flex for the right person). Outside IR35.
What You Should Do Now
If you are a Data Engineer and would like to apply, click the link and send us your resume.