Brio Digital

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with NHS or Central Government experience, offering £500/day inside IR35 and hybrid working in Leeds on a contract running until March 2026. Key skills include Python, AWS, Databricks, SQL, and data governance expertise.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
500
🗓️ - Date
November 2, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Inside IR35
🔒 - Security
Unknown
📍 - Location detailed
Leeds, England, United Kingdom
🧠 - Skills detailed
#SQL (Structured Query Language) #Scala #Data Pipeline #Agile #Data Architecture #ETL (Extract, Transform, Load) #Lambda (AWS Lambda) #NoSQL #AWS (Amazon Web Services) #Databricks #Data Engineering #Databases #S3 (Amazon Simple Storage Service) #Data Quality #Data Governance #Python #Cloud #MongoDB #Redshift #Version Control #Leadership
Role description
Data Engineer – (AWS / Databricks / Python / SQL)

Job Title: Data Engineer
Location: 1–2x/week on-site in Leeds
Duration: Until March 2026
Day Rate: £500/day (Inside IR35)

🚨 NHS or Central Government experience is ESSENTIAL 🚨

Brio Digital are supporting a HealthTech consultancy delivering a major transformation across NHS data platforms. We are seeking a senior candidate with strong expertise in Python and AWS Serverless architecture (including Lambda functions). The ideal individual will not only bring technical excellence but also act as a unifying presence, fostering collaboration, mentoring others, and maintaining harmony within a high-performing data team.

What You’ll Do
• Design and maintain end-to-end data pipelines within the AWS ecosystem
• Build reliable data solutions using Databricks, Python, and SQL
• Integrate and optimise data from diverse sources, including MongoDB
• Develop and maintain ETL processes for analytical and operational use
• Implement data quality, governance, and testing frameworks
• Collaborate with analysts, scientists, and delivery teams to ensure accessible, trusted data
• Contribute to the evolution of scalable, cloud-native data architectures
• Foster team alignment and positive collaboration across technical and non-technical stakeholders

What You’ll Bring
• Strong hands-on experience with Python for ETL and data engineering
• Proven expertise in AWS Data Services (Glue, Redshift, S3, Lambda, Databricks)
• Advanced SQL skills across data modelling, optimisation, and stored procedures
• Familiarity with MongoDB or other NoSQL databases
• Understanding of modern data architecture and data governance principles
• Experience with CI/CD, version control, and Agile project delivery
• Excellent communication and team leadership skills