

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer specialising in Azure Databricks, offering a 12-month contract at £500 - £650 per day. Remote work is available with occasional Bristol office visits. Candidates must be eligible for SC Clearance and have strong Python skills.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
650
🗓️ - Date discovered
July 1, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Yes
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Databricks #Python #Programming #Spark (Apache Spark) #Automation #Data Ingestion #PySpark #Azure Databricks #Deployment #ETL (Extract, Transform, Load) #Data Engineering #Scala #Azure #Monitoring #AI (Artificial Intelligence) #Data Architecture #Data Pipeline
Role description
Lead Data Engineer – Azure Databricks
Remote – 2 Days a Month in Bristol
Contract Opportunity – £500 to £650 per day (DOE)
Applicants must be eligible for SC Clearance
About the Role
TRIA is proud to be partnering with a purpose-driven, mission-led organisation that is using data to make a meaningful impact. As they scale their data capabilities and embrace new technologies, they are seeking an experienced Principal Data Engineer to lead the advancement of their Azure-based platform.
The Opportunity
You will work alongside a collaborative data team to enhance and maintain robust data ingestion pipelines, facilitate a transition to Azure and Databricks, and help productionise AI models with monitoring and alerting frameworks in place. This is a key role in supporting the organisation’s continued data maturity.
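To give a flavour of the day-to-day work, the sketch below shows what a small, reusable PySpark ingestion step with a basic row-count alert might look like on Azure Databricks. It is purely illustrative: the paths, table names and alerting hook are assumptions for the example, not taken from the organisation's actual platform.

# Hypothetical sketch of a reusable ingestion step on Azure Databricks.
# Paths, table names and the alerting hook are illustrative assumptions only.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

def ingest_to_delta(spark: SparkSession, source_path: str, target_table: str) -> DataFrame:
    """Read raw JSON files, stamp them with load metadata, and append to a Delta table."""
    df = (
        spark.read.format("json")
        .load(source_path)
        .withColumn("_ingested_at", F.current_timestamp())
        .withColumn("_source_file", F.input_file_name())
    )
    # Delta is the default table format on Databricks; append keeps the load idempotent per run.
    df.write.format("delta").mode("append").saveAsTable(target_table)
    return df

def check_and_alert(df: DataFrame, min_rows: int = 1) -> None:
    """Very simple monitoring hook: flag empty or suspiciously small loads."""
    row_count = df.count()
    if row_count < min_rows:
        # In production this might raise, page on-call, or post to a monitoring service.
        print(f"ALERT: ingestion produced only {row_count} rows")
    else:
        print(f"Ingestion OK: {row_count} rows loaded")

if __name__ == "__main__":
    spark = SparkSession.builder.appName("example-ingestion").getOrCreate()
    loaded = ingest_to_delta(spark, "/mnt/raw/events/", "analytics.events_bronze")
    check_and_alert(loaded)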
Key Responsibilities
• Build and optimise scalable, reusable data pipelines
• Improve team efficiency through smart automation and streamlined processes
• Support the deployment and monitoring of AI models in production
• Contribute to the evolution of ETL processes across the data platform
Your Experience
• Proven experience with Azure Databricks and its ecosystem
• Strong Python programming skills, ideally using PySpark
• Analytical mindset and structured approach to solving complex problems
• Understanding of modern data architecture and engineering practices
Contract Details
• Initial 12-month contract with potential for extension
• Competitive day rate
• Remote working model with occasional travel to Bristol office
• Work with a values-driven organisation making a real difference
Next Steps
If you're excited by the opportunity to apply your expertise in a meaningful, tech-forward environment, please apply with your CV or reach out to TRIA for a confidential discussion.