

Data/Python Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data/Python Engineer, offering £500 - £600 per day for a 3-month contract, fully remote. Key skills include Python, SQL, Snowflake, Airflow, and cloud experience (AWS preferred). Experience in data pipeline automation and governance is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
600
🗓️ - Date discovered
September 1, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#Automation #Cloud #Datasets #Puppet #Batch #Web Scraping #Snowflake #Data Quality #Observability #Scala #SQL (Structured Query Language) #Data Pipeline #AWS (Amazon Web Services) #Data Warehouse #Docker #Data Engineering #Python #Airflow #ETL (Extract, Transform, Load) #API (Application Programming Interface) #Terraform
Role description
DATA/PYTHON ENGINEER
£500 - £600 per day, Outside IR35
Fully Remote
3-month contract
Company Overview
We are working with a fast-growing digital marketing consultancy operating at the intersection of data, media, and technology. Their mission is to help brands understand their customers better and drive performance through data-driven decision-making. With a diverse team of technologists, analysts, and marketing specialists, they deliver cutting-edge solutions across iGaming, Sports, Retail, and Financial Services.
Role Overview & Responsibilities
We are looking for Data Engineers to join a central data engineering function, supporting a high-priority data project following recent team changes. You'll be working end-to-end on extracting, transforming, and automating partner data pipelines to improve user conversion insights and partner reporting.
Day-to-day, you will:
• Design and automate batch ingestion pipelines in Python, ensuring scalability and reliability (a minimal sketch follows this list).
• Extract data from multiple external sources via APIs, and where necessary, web scraping/browser automation (Playwright, Selenium, Puppeteer).
• Orchestrate pipelines using Airflow, and manage data quality workflows.
• Model and transform data in SQL and Snowflake to create clean, analytics-ready datasets.
• Ensure data quality, observability, and governance across workflows.
• Collaborate closely with product managers, analysts, and engineers to deliver high-quality data products for dashboards and reporting.
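For illustration only, here is a minimal sketch of the kind of batch ingestion pipeline described above, written with Airflow's TaskFlow API. The partner endpoint, field names, and target table are hypothetical placeholders rather than details from the role.

```python
# Hypothetical sketch only: the partner API, field names, and target table
# are placeholders, not details taken from the role description.
from datetime import datetime, timedelta

import requests
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    tags=["partner-ingestion"],
)
def partner_conversions_pipeline():
    @task
    def extract(ds=None) -> list[dict]:
        # Pull one day's conversion records from a hypothetical partner API;
        # Airflow injects the run's logical date as `ds`.
        resp = requests.get(
            "https://api.examplepartner.com/v1/conversions",
            params={"date": ds},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["results"]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Simple data-quality gate: fail the run if required fields are missing.
        required = {"partner_id", "event_ts", "conversion_value"}
        bad = [r for r in rows if not required.issubset(r)]
        if bad:
            raise ValueError(f"{len(bad)} rows are missing required fields")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # In a real pipeline this step would write to Snowflake (for example via
        # the Snowflake Airflow provider); stubbed out to keep the sketch small.
        print(f"would load {len(rows)} rows into RAW.PARTNER_CONVERSIONS")

    load(validate(extract()))


partner_conversions_pipeline()
```

Returning values between @task functions passes data via XCom, which is fine for small payloads; larger extracts would typically be staged in object storage and loaded from there instead.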
Technical Skills & Experience
We're looking for candidates who bring:
• Strong hands-on experience with Python for API ingestion, pipeline automation, and data transformation.
• Solid SQL skills with Snowflake (or similar cloud data warehouses).
• Experience with Airflow or other orchestration tools.
• Knowledge of data modelling, warehouse performance optimisation, and governance.
• Cloud experience (AWS preferred; Terraform/Docker a plus).
• Nice-to-have: browser automation/web scraping (Playwright, Selenium, Puppeteer); see the sketch below.
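As a loose illustration of the nice-to-have browser-automation skill, a Playwright sketch along these lines might be used where a partner exposes reports only through a web portal; the URL, selectors, and button labels are invented placeholders.

```python
# Hypothetical sketch: portal URL, selectors, and button labels are placeholders.
from playwright.sync_api import sync_playwright


def download_partner_report(username: str, password: str) -> str:
    """Log into a hypothetical partner portal and return the exported CSV as text."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://partners.example.com/login")
        page.fill("#username", username)
        page.fill("#password", password)
        page.click("button[type=submit]")
        # Wait until the reporting page has rendered before triggering the export.
        page.wait_for_selector("text=Conversion report")
        with page.expect_download() as download_info:
            page.click("text=Export CSV")
        csv_path = download_info.value.path()
        with open(csv_path, encoding="utf-8") as fh:
            data = fh.read()
        browser.close()
        return data
```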
If you are interested, please send over your CV.