Maxwell Bond®

Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Python) on an initial 6-month contract, paying outside IR35. It requires strong Python and SQL skills, experience with large datasets, and a focus on data quality. Work is hybrid, based in Manchester.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
440
-
🗓️ - Date
April 11, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Manchester Area, United Kingdom
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Data Quality #Athena #AI (Artificial Intelligence) #Data Engineering #REST (Representational State Transfer) #Spark (Apache Spark) #SQL (Structured Query Language) #Datasets #ETL (Extract, Transform, Load) #ML (Machine Learning) #Documentation #Python #Data Pipeline #AWS (Amazon Web Services)
Role description
Data Engineer (Python)
6 months initial
2 days a week onsite in Manchester
Outside IR35

This organisation delivers data-driven insights to media and market research clients. Its platform provides actionable data solutions to global brands, helping them make informed, strategic decisions.

The Role
You will join the Data Products team and take ownership of building and evolving behavioural data products. You will turn raw data signals into structured, high-quality datasets, using Python and SQL to define schemas, apply business logic, enforce quality checks, and deliver client-ready data feeds. The role follows a hybrid model with two days in the office each week and the rest remote.

Key Responsibilities
• Design and evolve data product schemas, translating requirements into well-structured datasets
• Build and maintain data feeds using Python and SQL for transformation, validation, and delivery
• Monitor and improve data quality through checks, diagnostics, and analysis of large datasets
• Work with Product, Engineering, Apps, and ML teams to deliver new features and improvements
• Maintain clear, accurate documentation for data products and processes

About You
You enjoy working close to the data and figuring out how real-world behaviour translates into clean, reliable datasets.

Required Skills
• Strong Python skills with experience writing production-quality data pipelines
• Solid SQL skills with experience handling large datasets
• Strong attention to detail with a focus on data quality and validation
• Clear communicator who works well across teams
• Comfortable using AI tools to improve productivity and learning

Nice to Have
• Experience with AWS data tools such as S3, Spark, Athena, or workflow orchestration tools
• Exposure to AI-assisted workflows

Why Join
You will join a team that values growth, collaboration, and continuous improvement, with support to develop your skills and progress your career.