

SF Recruitment
Lead Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Data Engineer on a contract basis, focused on designing an Azure-based data platform. It requires 3+ years in data architecture, strong Azure skills, and experience building ETL/ELT pipelines. The pay rate is listed as "unknown" and the location as "remote".
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Sutton Coldfield
-
🧠 - Skills detailed
#Data Lake #Migration #Azure Active Directory #.Net #REST API #Scala #Strategy #Data Management #Data Integration #Deployment #Metadata #Cloud #Data Pipeline #Microsoft Power BI #REST (Representational State Transfer) #BI (Business Intelligence) #CRM (Customer Relationship Management) #Batch #Databricks #Azure Data Factory #ODBC (Open Database Connectivity) #Data Engineering #ETL (Extract, Transform, Load) #Security #Azure #SQL (Structured Query Language) #Documentation #API (Application Programming Interface) #Data Architecture #Data Security #ADF (Azure Data Factory) #Data Governance
Role description
About the Role
We’re working with a market-leading organisation that’s undergoing a major transformation, moving from manual, Excel-based reporting to a fully automated, intelligence-driven data ecosystem.
As Data Architect, you’ll be responsible for designing and implementing the Azure-based data platform that becomes the single source of truth across the business. This is a hands-on, strategic role where you’ll build scalable, governed data architecture and shape how data is used across Finance, Operations, and Commercial functions.
What You’ll Be Doing
Data Architecture & Platform Design
• Design and implement an enterprise data lake on Azure Data Lake Gen2, using a Bronze/Silver/Gold (medallion) architecture (see the sketch after this list).
• Build and maintain scalable ETL/ELT pipelines in Azure Data Factory to integrate data from core systems (AS400, Tagetik, CRM, Esker, Slimstock).
• Develop the overall data model, data dictionaries, and lineage documentation.
• Deliver a stable “batch-first” integration strategy with AS400 during its .NET migration, with a roadmap toward API integration.
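As a rough, hypothetical illustration of the Bronze/Silver/Gold layering mentioned above (not the client's actual design), a Bronze-to-Silver promotion step in PySpark on Azure Data Lake Gen2 might look like the following; the storage account, paths, and column rules are assumptions.

```python
# Hypothetical Bronze -> Silver promotion step on Azure Data Lake Gen2.
# Storage account, container paths, and column rules are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_orders").getOrCreate()

LAKE = "abfss://lake@examplestorageacct.dfs.core.windows.net"  # assumed ADLS Gen2 account

# Bronze: raw batch extracts landed as-is (e.g. nightly AS400 order files).
bronze = spark.read.parquet(f"{LAKE}/bronze/as400/orders/")

# Silver: typed, de-duplicated, validated records ready for modelling.
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date", "yyyyMMdd"))
    .withColumn("net_value", F.col("net_value").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

silver.write.mode("overwrite").parquet(f"{LAKE}/silver/sales/orders/")
```

In practice each layer would typically be parameterised and orchestrated from Azure Data Factory (for example via a Databricks notebook or Mapping Data Flow activity), with Gold tables built as business-facing aggregates on top of Silver.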
Data Governance & Quality
• Implement the technical foundation for data governance – quality checks, metadata management, and master data validation.
• Embed business rules and validation logic directly within data pipelines (illustrated in the sketch after this list).
• Define and manage data security and access controls (Azure and Power BI row-level security).
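The in-pipeline validation described above can be illustrated with a simple routing pattern: rows that fail business rules are written to a quarantine location with a reason code, and clean rows continue to the Silver layer. The rules, paths, and column names below are hypothetical.

```python
# Hypothetical example of embedding validation logic inside a pipeline step:
# failing rows are quarantined with a reason code, passing rows flow on to Silver.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("validate_invoices").getOrCreate()
LAKE = "abfss://lake@examplestorageacct.dfs.core.windows.net"  # assumed ADLS Gen2 account

invoices = spark.read.parquet(f"{LAKE}/bronze/esker/invoices/")

# Illustrative business rules (assumptions): amounts must be positive and each
# invoice must carry a supplier code.
checked = invoices.withColumn(
    "dq_reason",
    F.when(F.col("invoice_amount") <= 0, "non_positive_amount")
     .when(F.col("supplier_code").isNull(), "missing_supplier_code"),
)

checked.filter(F.col("dq_reason").isNotNull()) \
    .write.mode("append").parquet(f"{LAKE}/quarantine/esker/invoices/")

checked.filter(F.col("dq_reason").isNull()).drop("dq_reason") \
    .write.mode("append").parquet(f"{LAKE}/silver/finance/invoices/")
```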
Implementation & Optimisation
• Lead the hands-on build, testing, and deployment of the Azure data platform.
• Monitor platform performance and optimise pipelines for cost, scalability, and speed.
• Define and document technical standards and best practices.
• Oversee the migration from legacy tools (Domo, Vecta) to the new Power BI ecosystem.
What You’ll Bring
Technical Skills
• Strong hands-on experience with Azure Data Lake Gen2, Azure Data Factory, and Azure Active Directory.
• Advanced skills in data modelling (conceptual, logical, physical) and SQL for complex transformations.
• Proven ability to design and build high-performance ETL/ELT pipelines.
• Understanding of data governance, security, and access control frameworks.
• Knowledge of batch and real-time data integration, and experience with ODBC connectors or REST APIs (see the batch-extract sketch after this list).
• Familiarity with Databricks and/or Microsoft Fabric is a bonus.
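To illustrate the batch ODBC integration mentioned above, a minimal nightly extract from an AS400/IBM i table into a landing area could look like the sketch below; the DSN, credentials, library/table names, and output path are placeholders, not details from the client.

```python
# Minimal sketch of a daily batch ODBC extract (e.g. AS400 / IBM i) into a landing file.
# DSN, credentials, library/table names, and output path are illustrative placeholders.
import pandas as pd
import pyodbc

conn = pyodbc.connect("DSN=AS400_PROD;UID=etl_user;PWD=change_me")  # assumed ODBC DSN

query = """
    SELECT ORDER_ID, CUSTOMER_ID, ORDER_DATE, NET_VALUE
    FROM SALESLIB.ORDERS
    WHERE ORDER_DATE >= CURRENT DATE - 1 DAY   -- incremental daily window (DB2 for i syntax)
"""

orders = pd.read_sql(query, conn)
orders.to_parquet("landing/as400/orders/orders_daily.parquet", index=False)
conn.close()
```

Once the AS400's .NET migration exposes REST endpoints, the same landing step could be fed from API calls instead, in line with the batch-first, API-later roadmap described earlier in the role.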
Experience
• 3+ years in a Data Architect or senior data engineering role.
• Proven record of designing and delivering cloud-based data platforms, ideally in Azure.
• Background working with complex ERP or transactional systems.
• Experience supporting or leading data transformation initiatives within a business setting.