

MM Flowers
Data Engineer - 12 Month FTC
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position on a 12-month FTC, offering competitive pay. Key skills include proficiency in Python, SQL, and Azure tools. Candidates should have 3+ years of relevant experience and familiarity with ETL processes and data governance.
Country
United Kingdom
Currency
£ GBP
-
Day rate
Unknown
-
Date
November 12, 2025
Duration
More than 6 months
-
Location
Unknown
-
Contract
Fixed Term
-
Security
Unknown
-
Location detailed
Alconbury, England, United Kingdom
-
Skills detailed
#ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Microsoft Power BI #Compliance #GIT #Data Engineering #BI (Business Intelligence) #Snowflake #SQL Server #MDM (Master Data Management) #Data Management #Storage #Terraform #MS SQL (Microsoft SQL Server) #GitHub #Observability #Synapse #Data Lake #Azure #SQL Queries #Datasets #Data Quality #Data Pipeline #Cloud #Version Control #IAM (Identity and Access Management) #MLOAD (MultiLoad) #Data Governance #Strategy #Python #ADF (Azure Data Factory) #Oracle #Documentation
Role description
As a Data Engineer at MM Flowers, you'll help drive our digital transformation by expanding and improving the data infrastructure that powers financial and operational insight. The role will be a key enabler in the delivery of significant technology-focused projects and the continual development of the digital landscape across the business.
We're looking to grow our data team with a skilled, proactive Data Engineer who is keen to establish strong data governance and observability practices, ensuring datasets are versioned, catalogued, and fully traceable from source to output, while sharing their knowledge within the data team.
Working closely with the IT Team, vendors and business stakeholders, the Data Engineer will implement a data warehousing solution that will enable better use of data and reporting, driving business benefits in alignment with the wider business strategy.
The ideal candidate will bring experience working with complex datasets, a strong grasp of modern data engineering tools and best practices, and the curiosity to solve problems. You'll be hands-on, adaptable, and motivated to create your own data pipelines and other data solutions.
This is a huge opportunity to utilise your previous experience to assist in the successful delivery of business transformation.
• Design, build and operate ELT/ETL pipelines using Microsoft-based tools (Data Factory, Fabric).
• Maintain a medallion architecture (Bronze→Gold) for trusted, refined datasets.
• Develop, optimize, and maintain complex SQL queries to support analytics and reporting requirements.
• Implement data quality, testing and observability; ensure lineage, accuracy and compliance.
• Enable self-serve analytics through well-documented models and transformation logic.
• Integrate internal and external data sources.
• Manage data infrastructure (warehouses, data lakes, storage); tune performance and monitor health.
• Troubleshoot incidents, run root-cause analysis, deploy fixes and provide technical support.
• Apply best practices for data quality, testing, and observability, helping to ensure the data delivered to stakeholders is accurate and trustworthy.
• Contribute to CI/CD practices, documentation and engineering standards.
• Partner cross-functionally to deliver fit-for-purpose data solutions.
• Proactively identify opportunities for continuous improvement.
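For illustration only: a minimal sketch of the medallion layering (Bronze→Gold) named in the responsibilities above, using pandas and entirely hypothetical data. Column names, values, and the Bronze/Silver/Gold split are assumptions for the example, not MM Flowers' actual pipeline.

```python
import pandas as pd

# Hypothetical raw ("Bronze") extract: duplicates, mixed types, inconsistent casing.
bronze = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "amount": ["10.50", "7.25", "7.25", None],
    "region": ["UK", "uk", "uk", "UK"],
})

# "Silver" layer: deduplicate, enforce types, standardise values, drop bad rows.
silver = (
    bronze.drop_duplicates(subset="order_id")
          .assign(
              amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"),
              region=lambda d: d["region"].str.upper(),
          )
          .dropna(subset=["amount"])
)

# "Gold" layer: business-ready aggregate for reporting.
gold = silver.groupby("region", as_index=False)["amount"].sum()
print(gold)
```

In a production Azure setup, the same layering would typically be orchestrated with Data Factory or Fabric pipelines writing to lakehouse tables rather than in-memory DataFrames.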
What you can already do
• Minimum 3 years' experience in data engineering, data analytics, and BI.
• Proficiency in Python and SQL.
• Experience delivering technology projects within fast-paced, medium-sized organisations.
• Ability to deliver solutions within the appropriate framework and methodology while ensuring the supportability of delivered services.
• Experience working with ETL (Extract, Transform, Load) pipelines.
• Proven experience building and operating pipelines on Azure (ADF, Synapse, Fabric).
• Familiarity with version control systems (Git, GitHub) and CI/CD best practices.
• Excellent understanding of Power BI Service and Fabric.
• Strong grasp of data modelling and warehousing concepts, and of platforms such as MS SQL Server, Oracle and Snowflake.
• Knowledge of Infrastructure-as-Code (e.g., Terraform), identity and secrets management (IAM), and cloud cost optimisation at scale.
• Knowledge of information principles, processes, and Master Data Management (MDM).
• Good understanding of strategic and emerging technology trends, and the practical application of existing and emerging technologies to new and evolving business and operating models.
• Understanding of IT standards and controls.
• Excellent communication skills at all levels.
• Strong problem-solving skills.
• Inquisitive nature and interest in project work.
• Ability to work on own initiative.
• Ability to work under pressure.
• Collaborative working style and strong interpersonal skills.
• High attention to detail.
• Self-motivated and a logical thinker.






