Nuclear Waste Services

Senior Azure Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Azure Data Engineer on a 12-month contract, offering competitive pay. Key skills required include Azure Data Factory, SQL, Python, and CI/CD with Azure DevOps. Experience in data governance and cloud-native architectures is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 9, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
South East, England, United Kingdom
-
🧠 - Skills detailed
#Azure ADLS (Azure Data Lake Storage) #Microsoft Power BI #Scala #Data Lake #Data Quality #Azure SQL #GIT #Apache Spark #DevOps #Data Engineering #BI (Business Intelligence) #Documentation #Spark (Apache Spark) #Deployment #Azure #Azure DevOps #Infrastructure as Code (IaC) #Automation #Compliance #Classification #AI (Artificial Intelligence) #Data Architecture #Monitoring #Python #Azure Data Factory #Metadata #Cloud #Databricks #Scripting #Agile #Security #ADF (Azure Data Factory) #Data Storage #GDPR (General Data Protection Regulation) #Data Management #SQL (Structured Query Language) #ADLS (Azure Data Lake Storage) #Data Pipeline #Strategy #YAML (YAML Ain't Markup Language) #DAX #Data Processing #Data Lakehouse #Storage #Data Visualisation #Synapse #Data Governance #Physical Data Model
Role description
We are seeking a Senior Azure Data Engineer to join our IT team on a 12-month contract!

About the Role

As a Senior Azure Data Engineer, you will lead the design, development, and optimisation of data flows and pipelines across the NWS enterprise data platform. You will play a pivotal role in shaping the NWS data engineering strategy, ensuring secure, scalable, and efficient data solutions that support business intelligence, analytics, and innovation.

Responsibilities

• Lead the design and implementation of data pipelines using Azure Data Factory, Synapse Analytics, Databricks, and Azure SQL.
• Define and maintain data architecture standards, ensuring alignment with enterprise architecture and security policies.
• Champion best practices in data modelling, data quality, and metadata management.
• Build and optimise scalable data lakehouse solutions using Azure Data Lake Storage Gen2.
• Implement CI/CD pipelines and Infrastructure as Code (IaC) using Azure DevOps and Git.
• Ensure solutions are secure-by-design and private-by-design, in line with ISO 27001, UK GDPR, and the Microsoft Cloud Security Benchmark.
• Oversee the monitoring, performance tuning, and troubleshooting of data pipelines and services.
• Collaborate with internal Cloud Operations and Information Security teams to ensure operational resilience and compliance.
• Develop and maintain comprehensive documentation for data pipelines, architecture, and governance processes.
• Support the development of agile assurance and governance frameworks to accelerate delivery.
• Act as an Intelligent Client to external suppliers and delivery partners.
• Communicate effectively with technical and non-technical stakeholders, translating business needs into data solutions.
• Mentor and support junior engineers and colleagues across IT, Digital, and Information Security.

Qualifications

Essential
• Data development process: Highly skilled in designing, building, and maintaining robust data pipelines and architectures.
• Data modelling: Highly skilled in designing conceptual, logical, and physical data models for analytics and operational use.
• Coding and scripting: Highly skilled in SQL, Python, Apache Spark, YAML, and DAX; experience with distributed data processing.
• CI/CD and DevOps: Strong experience with Azure DevOps, Git, and automated deployment pipelines.
• Cloud engineering: Deep understanding of Azure data services and cloud-native architectures.

Desirable
• Data governance and privacy: Familiarity with Microsoft Purview and data classification frameworks.
• Data visualisation: Experience with Power BI and integration with M365.
• Data storage: Experience working with Microsoft Fabric.
• AI and automation: Exposure to AI tools such as Microsoft Copilot and automation of data workflows.

Pay range and compensation package

This role will be onboarded and payrolled via our agency provider, Alexander Mann Solutions. Recruitment will be conducted by the NWS Talent Acquisition team; once the successful candidate has been identified, the agency provider will be in touch to commence onboarding and agree next steps. There are two payroll options via our agency provider: PAYE (Pay As You Earn) or an Umbrella Company.