Senior Data Engineer/Architect (Contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer/Architect (Contract) in the West Country, hybrid with 3 days remote work weekly. Key skills include hands-on Databricks experience, Azure Cloud knowledge, SQL, Python, and ETL design.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
August 28, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Wiltshire, England, United Kingdom
🧠 - Skills detailed
#DevOps #Terraform #SQL Queries #Data Governance #Data Quality #Azure #Data Lake #SQL (Structured Query Language) #Migration #Synapse #Data Management #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Azure cloud #Leadership #PySpark #Databricks #Cloud #Data Engineering #Matillion #Infrastructure as Code (IaC) #Python
Role description
West Country - Hybrid - 3 Days Work from Home per Week

Shape a greenfield data platform from day one, setting the standards for architecture, engineering and governance that will guide a major financial services organisation for years to come. You’ll define how data is designed, delivered and used, while steering the migration to Databricks and mentoring teams to raise their technical game.

This is a rare chance to take real ownership: you’ll make hands-on technical decisions, assure third-party delivery, and establish best practices that become the blueprint for a large-scale transformation. The impact of your work will be felt not just across engineering but at board level, unlocking data to drive smarter decisions across the business.

What you'll do

This is a large-scale data transformation. You’ll own the migration of the existing data lake into Databricks while helping the organisation develop new capabilities in data governance, data quality and data management. Right now, most delivery is handled by external suppliers, but they need someone internal to assure quality, raise in-house skills and provide strong technical leadership across projects. That’s where you come in.

You’ll be the technical lead with deep engineering knowledge and practical architecture skills. You'll take ownership of the platform migration and assurance, leading the migration of the existing data lake into Databricks. You'll assure engineering quality from third-party suppliers and contribute hands-on to coding, pipeline development and modelling of core data entities. You'll also support configuration, testing frameworks, and knowledge transfer into their architecture and engineering teams with implementation-driven design.

In addition, you'll support the existing data engineering team with business-as-usual tasks (SQL queries, Synapse analytics, pipeline maintenance), including introducing best practices.
You’ll also mentor the team in coding, testing and DevOps, and collaborate with the Software Delivery Manager and data engineering leadership.

What you'll need

• Hands-on Databricks experience
• Strong Azure Cloud knowledge
• Proficiency in SQL, Python and PySpark
• ETL & pipeline design (Matillion preferred, alternatives acceptable)
• Practical data modelling & pipeline architecture
• Terraform or Bicep for Infrastructure as Code (IaC)

About the company

The company is one of the longest-established financial advice networks in the UK and has helped over two generations of people to make good financial decisions.

Please click the ‘Apply’ button. Don’t worry if your CV isn’t up to date. Just send what you have and we’ll deal with that later.