

Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 5+ years of experience in cloud data engineering, specifically Azure and Databricks, for a 6-month remote contract (£550-£580/day). Key skills include SQL, data governance, and D365 F&O migration experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
580
🗓️ - Date discovered
June 14, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#SQL (Structured Query Language) #Deployment #Data Pipeline #Version Control #Azure Blob Storage #ADF (Azure Data Factory) #Data Governance #Compliance #Python #Monitoring #dbt (data build tool) #MDM (Master Data Management) #Observability #Data Engineering #Security #Databricks #Cloud #Synapse #Data Transformations #Azure Databricks #Data Quality #Data Management #Azure #Storage #Data Migration #SharePoint #ETL (Extract, Transform, Load) #Datasets #Azure SQL Database #Scala #Azure SQL #DevOps #Documentation #Azure Data Factory #Migration #Data Ingestion #Data Security #Azure DevOps
Role description
Azure/Databricks Data Engineer (D365 F&O Deployment)
£550 - £580 PER DAY
6-month contract
OUTSIDE IR35
REMOTE, but must be UK-based
Our client, a leader in their field, requires a talented Data Engineer to join their Group technology team, which is responsible for the global intelligent data platform. You will be joining the business at a key moment in its evolution and will make a lasting impact on its technology organisation and landscape. You will report to the Group Director of Data and Architecture, with responsibility for the data workloads delivered against their Azure/Databricks platform.
You will be working within a major transformation programme migrating data to D365 F&O. You will be expected to have a proactive, hands-on approach.
You will be a key contributor to designing, developing, and managing data ingestion processes and transformation pipelines within Azure and Databricks environments.
The role involves utilizing the Databricks medallion architecture to create well-defined, governed data consumption models, adopting a data-as-a-product mindset, and implementing key platform governance steps such as master data management and augmentation, observability, and exception/quality reporting.
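To make that concrete, here is a minimal PySpark sketch of a bronze-to-silver medallion step of the kind described above. The storage path, table names, and columns are illustrative assumptions, not details of the client's actual platform.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided; getOrCreate() also works locally with pyspark installed.
spark = SparkSession.builder.getOrCreate()

# Bronze: land raw D365 F&O extracts unchanged (the abfss path is a made-up example).
raw = spark.read.json("abfss://landing@examplestore.dfs.core.windows.net/d365fo/customers/")
raw.write.format("delta").mode("append").saveAsTable("bronze.d365fo_customers")

# Silver: cleanse and deduplicate into a governed, consumption-ready dataset.
silver = (
    spark.table("bronze.d365fo_customers")
    .filter(F.col("customer_id").isNotNull())          # basic quality gate
    .dropDuplicates(["customer_id"])                   # one row per business key
    .withColumn("ingested_at", F.current_timestamp())  # lineage/observability column
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.customers")
```

Writing each layer as a Delta table keeps the flow auditable and lets the silver and gold models be rebuilt deterministically from the raw bronze landing zone.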
The ideal candidate will have experience in cloud data engineering, an understanding of Databricks, and a strong proficiency in Azure data services.
YOUR SKILLS:
• Minimum of 5 years of experience in data engineering with a focus on cloud technologies.
• Proven experience with Azure services (eg, Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse, Azure Blob Storage).
• Extensive experience with Databricks, including the development and management of data pipelines.
• Strong proficiency in SQL and reasonable Python skills.
• Experience with data governance, data quality, and data security best practices.
• Familiarity with data-as-a-product concepts and methodologies.
• Excellent problem-solving skills and ability to work in a fast-paced, dynamic environment.
• Strong communication and collaboration skills.
• Previous experience in a D365 Data Migration role, with experience in Finance & Operations.
• Extensive wider knowledge of the Microsoft D365 application stack.
• Proficiency in F&O standards documentation and best practices.
• Good understanding of, and solution design experience with, related technologies, including but not restricted to:
• Microsoft Dynamics Finance and Operations
• Microsoft Power Platform
• Microsoft Collaboration Platforms (Office365, SharePoint, Azure DevOps)
WHAT YOU WILL BE DOING:
• Design, implement, and maintain scalable and efficient data pipelines for ingestion, processing, and storage of global enterprise datasets using Azure and Databricks.
• Take end-to-end ownership of application ingestion workloads, ensuring all platform steps/runbooks are adopted.
• Utilize the Databricks medallion architecture (bronze, silver, gold layers) to ensure clean, reliable, and organized data flows.
• Ensure strict version control and reproducibility of data transformations using the dbt toolset.
• Develop and maintain ETL processes to transform raw data into structured data sets for analysis and consumption.
• Work within our data governance framework, implementing our runbook best practices to ensure data quality, accessibility, security, and compliance.
• Collaborate with identified data stewards to define productised data consumption models/products and ensure workload datasets map to the target models.
• Ensure master data structures and reference data are correctly augmented to each workload.
• Optimize and troubleshoot data pipelines to ensure high performance and reliability.
• Use best practice implementations for observability, alerting, and monitoring to evolve an effective data operations function.
• Align workloads to our master data management function to ensure data can be matched across applications/workloads (a minimal sketch of this pattern follows this list).
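As an illustration of the master-data alignment and exception/quality reporting duties above, the sketch below shows one plausible gold-layer pattern in PySpark. All table names, the master_customer_id column, and the 5% threshold are assumptions for the example, not the client's actual runbook.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Gold: join the silver dataset to an assumed MDM master table so records can be
# matched across applications/workloads.
gold = (
    spark.table("silver.customers")
    .join(spark.table("mdm.customer_master"), on="customer_id", how="left")
)

# Exception/quality reporting: unmatched rows go to an exceptions table instead of
# being silently dropped.
exceptions = gold.filter(F.col("master_customer_id").isNull())
exceptions.write.format("delta").mode("append").saveAsTable("quality.customer_exceptions")

(gold.filter(F.col("master_customer_id").isNotNull())
     .write.format("delta").mode("overwrite").saveAsTable("gold.customers"))

# Simple monitoring hook: fail loudly if too many records miss master data.
total, bad = gold.count(), exceptions.count()
if total and bad / total > 0.05:  # the 5% threshold is an assumed example value
    raise ValueError(f"Master-data match exceptions {bad}/{total} exceed threshold")
```

Routing unmatched records to an exceptions table, rather than dropping them, gives the data stewards a concrete quality report to act on.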
If your profile matches the above, please send your CV for full details: