Harnham

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a contract of unknown length, offering £550 per day Outside IR35. Based in London (hybrid), it requires strong Azure Databricks, SQL Server, and dbt skills, along with experience in legacy BI environments and collaborative project work.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£550
🗓️ - Date
February 28, 2026
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Outside IR35
🔒 - Security
Unknown
📍 - Location detailed
London
🧠 - Skills detailed
#Azure Databricks #Data Engineering #Cloud #Migration #Data Warehouse #Datasets #SQL (Structured Query Language) #Tableau #SQL Server #dbt (data build tool) #Scala #SSRS (SQL Server Reporting Services) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #BI (Business Intelligence) #PySpark #Azure #Databricks
Role description
Data Engineer
London, £550 per day Outside IR35, Hybrid

This is an exciting opportunity to play a key role in a major data modernisation programme, focused on migrating a large SQL Server estate into a cloud-native Azure Databricks environment. You will be central to transforming legacy reporting and data logic into scalable, modernised pipelines and models, helping the business unlock faster, more reliable insights.

The Company
They are a well-established organisation undergoing a significant transformation of their data landscape. With a strong commitment to modern BI practices and cloud engineering, they are investing in next-generation technology to improve analytics capabilities across the business. You will join a collaborative environment where engineering excellence, trusted data, and high-quality reporting are core priorities.

The Role and Deliverables
• Lead the migration of SQL Server stored procedures, functions, views, and legacy reporting logic into Azure Databricks.
• Reengineer and optimise SQL workloads for Databricks using Databricks SQL, dbt, and PySpark.
• Support the uplift of SSRS and Tableau reporting so that all outputs are powered by Databricks-based datasets.
• Validate migrated datasets and reporting outputs, ensuring high levels of accuracy and performance.
• Document pipelines, models, and migration processes for long-term maintainability.
• Collaborate with BI, data warehouse, and project teams to ensure smooth delivery across the programme.

Your Skills and Experience
• Strong experience working with Azure Databricks, including SQL development, data modelling, and PySpark.
• Proven capability in SQL Server, including complex T-SQL logic, stored procedures, and performance optimisation.
• Hands-on experience with dbt for modular, testable data model development.
• Solid understanding of legacy BI environments, particularly SSRS.
• Knowledge of Tableau and how to optimise dashboards against cloud-based data sources.
• Ability to work collaboratively within a BI, data warehouse, or reporting team during large-scale migrations.

How to Apply
If this project aligns with your experience, please apply with your most recent CV.