

SPG Resourcing
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on an initial 6-month contract, hybrid in Manchester, paying £550-£600 per day. Requires strong experience in greenfield data environments, SQL, cloud platforms (AWS/Azure/GCP), and infrastructure as code (Terraform).
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
640
-
🗓️ - Date
January 10, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
Manchester Area, United Kingdom
-
🧠 - Skills detailed
#Batch #AI (Artificial Intelligence) #DevOps #Infrastructure as Code (IaC) #ML (Machine Learning) #Data Ingestion #Cloud #Data Pipeline #Microsoft Power BI #Terraform #Data Quality #Azure #BI (Business Intelligence) #Data Engineering #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Architecture #GCP (Google Cloud Platform) #Scala #GDPR (General Data Protection Regulation)
Role description
Contract Data Engineer
Greenfield Data Platform
Outside IR35 | £550-£600 per day
Initial 6 months | Start ASAP
Location: Hybrid Manchester (1-2 days per week)
Overview
You will be working with a supply chain management consultancy building a greenfield data platform from the ground up. Data preparation is currently manual and spreadsheet-driven, posing clear delivery risks and scalability limitations. The new platform will manage ingestion and standardisation of customer data and support processing and analytics outputs.
The organisation is now establishing its target data architecture internally. The engineer will work alongside the architect to define the core data model and build robust, scalable pipelines and data assets that operationalise that architecture.
This is a hands-on Data Engineering role with a strong emphasis on implementation quality, performance, and reliability in a greenfield environment.
Key Responsibilities
• Build and optimise data ingestion pipelines for varied client data sources, including APIs, file uploads, batch feeds, streaming, and CDC
• Implement data models and transformations in line with defined architectural and modelling standards
• Engineer scalable ETL pipelines that handle very large data volumes
• Work closely with the architect to translate logical designs into production-ready solutions
• Develop and maintain data quality, validation, and enrichment processes
• Support AI- and ML-driven use cases for data identification, categorisation, and enrichment
• Ensure data pipelines align with GDPR and ISO requirements
• Contribute to infrastructure-as-code practices alongside platform and DevOps teams
• Support downstream analytics and reporting use cases, including Power BI
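To give a flavour of the data quality and validation work listed above, here is a minimal, purely illustrative sketch of one ingestion step: reading a client CSV feed, checking rows against simple quality rules, and splitting clean records from rejects. The field names (`order_id`, `sku`, `quantity`) and rules are invented for the example, not taken from the posting.

```python
import csv
import io

# Hypothetical required fields for an incoming client order feed.
REQUIRED_FIELDS = ("order_id", "sku", "quantity")

def validate_row(row: dict) -> list[str]:
    """Return a list of data-quality errors for one ingested record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not row.get(field, "").strip():
            errors.append(f"missing {field}")
    qty = row.get("quantity", "")
    if qty and not qty.isdigit():
        errors.append("quantity not a positive integer")
    return errors

def ingest(raw_csv: str):
    """Split a raw feed into accepted rows and (row, errors) rejects."""
    accepted, rejected = [], []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            accepted.append(row)
    return accepted, rejected

# Example feed: one clean row, one row missing a SKU with a bad quantity.
feed = "order_id,sku,quantity\n1001,ABC,5\n1002,,x\n"
ok, bad = ingest(feed)
```

In a real pipeline the reject path would typically feed a quarantine table and monitoring, rather than being discarded.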
Environment
• Fully greenfield data estate
• Cloud-agnostic architecture with early AWS prototyping
• Highly variable third-party client data
• Microsoft 365 for internal operations
• Power BI used in a consultant-driven delivery model
Required Experience
• Strong senior data engineering experience on modern data platforms
• Proven delivery on greenfield or early-stage data environments
• Hands-on experience building scalable ingestion and transformation pipelines
• Strong SQL and data transformation capability
• Cloud data engineering experience across AWS, Azure, or GCP
• Familiarity with infrastructure as code, ideally Terraform
• Comfortable working autonomously while collaborating closely with architecture and product stakeholders
• Pragmatic, delivery-focused mindset with strong attention to data quality and performance