

Senior DataOps Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior DataOps Data Engineer, offering an 18-month FTC with a salary of £55,000-£67,000. Located in Leeds (2/3 days hybrid), it requires expertise in the Azure Data Platform, CI/CD pipelines, and data governance.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£282.68
🗓️ - Date discovered
July 26, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Fixed Term
🔒 - Security clearance
Unknown
📍 - Location detailed
Leeds, England, United Kingdom
🧠 - Skills detailed
#Data Engineering #Data Pipeline #Automation #Data Science #Azure Databricks #Vault #Programming #Compliance #Agile #Delta Lake #Data Quality #Data Governance #ADLS (Azure Data Lake Storage) #Deployment #Bash #Microsoft Azure #SQL (Structured Query Language) #Monitoring #Azure Data Factory #Observability #DataOps #Security #Azure ADLS (Azure Data Lake Storage) #GDPR (General Data Protection Regulation) #Python #Spark (Apache Spark) #Terraform #Infrastructure as Code (IaC) #Data Security #PySpark #Storage #Azure #Data Lake #Databricks #DevOps #Data Lifecycle #Azure SQL #ADF (Azure Data Factory)
Role description
Senior DataOps Data Engineer
Leeds, 2/3 days per week
£55,000 - £67,000 + Excellent Benefits
18 Months FTC with long-term potential
Your New Company
Our client, a large public sector organisation, is seeking a senior Data Platform DataOps Engineer to serve as their first DataOps specialist in a growing team of Data Engineers and DevOps professionals. In this pivotal role, you will focus on operationalising and automating their data lifecycle to ensure that data workflows run reliably and efficiently. You will integrate CI/CD data pipelines, streamline deployment processes, enforce robust data governance, and optimise operational costs within their Microsoft Azure environment. Your work will centre on proactive system monitoring, error resolution, and continuous improvement, while mentoring and guiding colleagues.
What you will be doing
• Oversee and automate the operational processes that support data workflows developed by the Data Engineering team while ensuring seamless coordination with the DevOps group.
• Spearhead the development, integration, and maintenance of CI/CD data pipelines for automated deployments.
• Integrate best practices for monitoring and observability to proactively detect, analyse, and resolve issues.
• Enforce robust data governance and security protocols through tools like Azure Key Vault, ensuring compliance with GDPR and other regulatory frameworks (a sketch of runtime secret retrieval follows this list).
• Collaborate closely with Data Engineering, Data Science, Analytics, and DevOps teams to align operational strategies with technical and business requirements.
• Optimise operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage.
• Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements.
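By way of illustration, the following is a minimal sketch (not part of the role description) of how a pipeline can retrieve secrets from Azure Key Vault at runtime instead of hard-coding credentials. It assumes the azure-identity and azure-keyvault-secrets Python packages; the vault URL and secret name are hypothetical placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential resolves to a managed identity when running in
# Azure, or a developer login locally, so no credentials live in code.
credential = DefaultAzureCredential()

# Hypothetical vault URL and secret name -- substitute real values.
client = SecretClient(
    vault_url="https://example-data-vault.vault.azure.net",
    credential=credential,
)
connection_string = client.get_secret("sql-connection-string").value
```

Keeping credentials in Key Vault rather than in pipeline code or source control is a common building block of the governance and GDPR-compliance duties listed above.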
What you will need
• Demonstrable experience in DataOps, Data Engineering, DevOps, or related roles focused on managing data operations in complex, data-centric environments.
• Proven experience working with agile teams and driving automation of data workflows within the Microsoft Azure ecosystem.
• Hands-on expertise with the Azure Data Platform, including components such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM.
• Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for automation.
• Strong aptitude for data pipeline monitoring and an understanding of data security practices such as RBAC and encryption.
• Experience implementing data and pipeline observability dashboards, ensuring high data quality and improving the efficiency of data workflows (a minimal example of such a check follows this list).
• Experience ensuring compliance with regulatory frameworks and implementing robust data governance measures.
• Demonstrated ability to implement Infrastructure as Code using Terraform, to provision and manage data pipelines and associated resources.
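As a rough flavour of the Python/PySpark automation these requirements describe, here is a minimal sketch of a data-quality gate that a CI/CD pipeline could run before promoting a dataset. The Delta Lake path, column name, and 1% threshold are illustrative assumptions, not details taken from the role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()

# Hypothetical Delta Lake table on ADLS -- substitute a real path.
df = spark.read.format("delta").load(
    "abfss://bronze@examplestorage.dfs.core.windows.net/orders"
)

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
null_ratio = null_ids / max(total, 1)

# Raising fails the job, so the CI/CD orchestrator (e.g. Azure DevOps)
# blocks the deployment stage instead of promoting bad data downstream.
if null_ratio > 0.01:  # illustrative 1% threshold
    raise ValueError(
        f"Data quality gate failed: {null_ratio:.2%} null order_id values"
    )
print(f"Data quality gate passed: {total} rows, {null_ratio:.2%} nulls")
```

Checks like this are typically wired into the pipeline as a gated step, which is also where the observability dashboards mentioned above would source their metrics.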
What you will get in return
In return, you'll receive a competitive salary of £55,280 - £62,190, plus the opportunity to take an 8% cash benefit uplift. This is an 18-month fixed-term contract with strong potential to become permanent. You'll also benefit from an excellent package that includes private medical insurance and a generous pension scheme, making this a highly attractive opportunity.
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career.
Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and as an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk.