

Mentmore
D365 Data Engineer (Contract)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a D365 Data Engineer on a 6-month contract at a rate of £XX/hour, with remote work available. It requires 3-5 years of ERP data migration experience and proficiency in Azure Data Factory, Azure Databricks, SQL, and Python.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
650
-
🗓️ - Date
January 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#Azure Data Factory #Azure #Data Architecture #Data Extraction #Data Conversion #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Cleansing #Data Mapping #Microsoft Power BI #Python #Data Migration #BI (Business Intelligence) #MS D365 (Microsoft Dynamics 365) #Data Engineering #Data Accuracy #Azure Databricks #Migration #Databricks #Business Analysis #Data Quality #Scripting #Documentation #Data Management #ADF (Azure Data Factory) #Data Profiling
Role description
ROLE OVERVIEW
We are recruiting for a Data Engineer to join a data migration workstream (Dynamics program), which is rolling out D365 F&O to various UK businesses.
The Data Engineer will work with the data team and the D365 program team to execute the end-to-end data migration process.
This is an integral part of the data team, which consists of other data engineers, a data quality / BI consultant, and data consultants responsible for loading data to Dynamics.
The Program is rolling out D365 to multiple business units in parallel; as such, the Data Engineer may pivot across several concurrent data migration efforts at different stages of their lifecycle.
You will be responsible for implementing the code and executing the process to extract, transform, and load data from source systems (both D365 and others like Access Dimensions) to D365; leveraging and, where possible, refining DM processes and tools; reporting on status and blockers to the Data Lead and project managers; collaborating with colleagues in the project team (functional, technical, change/training).
KEY TASKS & RESPONSIBILITIES
Data Migration Process and Tooling
• Work with Azure Data Factory (ADF) extraction and orchestration pipelines, making enhancements where required, and integrating outputs with Azure Databricks for downstream transformation
• Develop and maintain the data migration pipeline in Azure Databricks using Python/SQL
• Work with the Data Lead and D365 program team to iteratively review and refine the data migration process and pipeline for efficiency, auditability, and data accuracy
• Work with the Data Reporting Analyst(s) to support front-end (Power BI) use cases enabled by the pipeline, including day-to-day data quality monitoring, data migration status reporting, and data migration analytics
Data Discovery, Assessment, and Profiling
• Analyze system documentation (e.g., system design documents, FDDs) to understand data architecture in the core model; interface directly with systems implementation partner to clarify understanding
• Participate in data discovery for new source systems to document source data architecture and design considerations relevant to data mapping and transformation (especially for known non-D365 source systems, e.g., Access Dimensions, Concept Evolution)
• Work directly with Technical workstream to understand system architecture and integration requirements
• Conduct thorough data profiling and quality assessment across source systems
• Analyze and provide reporting on data volume and complexity to inform planning
• Coordinate with the systems integrator to understand master data needed to support transactional data migration
• Collaborate with business SMEs to validate data definitions
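The profiling work above typically starts with per-column null and distinct counts over each extract. A minimal sketch in plain Python (standing in for the equivalent Databricks/PySpark aggregation; column and entity names are hypothetical):

```python
def profile(rows, columns):
    """Per-column profile of a list-of-dicts extract: total rows,
    null count, and distinct non-null values."""
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "rows": len(values),
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

# Hypothetical customer extract with a missing VAT number.
sample = [
    {"customer_id": "C001", "vat_no": "GB123"},
    {"customer_id": "C002", "vat_no": None},
    {"customer_id": "C002", "vat_no": "GB456"},
]
report = profile(sample, ["customer_id", "vat_no"])
```

The same report feeds the volume/complexity planning view: duplicate keys show up as `distinct < rows`, and mandatory-field gaps as non-zero null counts.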
Data Extraction
• Work with business SMEs, systems integrator, and PMO to document, validate, and support data extraction scope criteria / business rules for master and transactional data
• Work with Business Analysis team to confirm master and transactional data requirements and validation approaches; confirm alignment between transactional and master data scope
• Implement data extraction scope criteria within ADF and Databricks extraction logic as appropriate
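Scope criteria agreed with SMEs usually reduce to predicates applied in the extraction layer. A sketch of one such rule in plain Python (the rule itself, the status values, and the cutoff date are all hypothetical, purely to illustrate the shape):

```python
from datetime import date

def in_scope(txn, cutoff=date(2024, 1, 1)):
    """Hypothetical scope rule: migrate all open transactions, plus
    closed transactions posted on or after the cutoff date."""
    return txn["status"] == "open" or txn["posted"] >= cutoff

txns = [
    {"id": 1, "status": "open",   "posted": date(2022, 5, 1)},
    {"id": 2, "status": "closed", "posted": date(2023, 6, 1)},
    {"id": 3, "status": "closed", "posted": date(2024, 3, 1)},
]
scoped = [t for t in txns if in_scope(t)]  # transactions 1 and 3
```

In practice the equivalent predicate would be pushed into the ADF/Databricks extraction query so out-of-scope rows never enter the pipeline.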
Source Data Quality and Cleansing
• Identify data quality issues with extracted data and develop remediation strategies (e.g., cleanse in source)
• Work with business SMEs to implement data cleansing requirements and thresholds / KPIs
• Work directly with business SMEs to conduct iterative cycles of data cleansing and re-reporting on data quality until thresholds / KPIs are met or exceeded
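The threshold-driven cleansing loop above needs a measurable KPI per field. A minimal completeness check in plain Python (the 95% threshold and the field are hypothetical; real thresholds would be agreed with the business SMEs):

```python
def completeness(rows, field):
    """Percentage of rows where `field` is populated (non-null, non-empty)."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return 100.0 * filled / len(rows)

KPI_THRESHOLD = 95.0  # hypothetical agreed threshold

rows = [
    {"email": "a@example.com"},
    {"email": ""},
    {"email": "b@example.com"},
    {"email": "c@example.com"},
]
score = completeness(rows, "email")   # 75.0
needs_cleansing = score < KPI_THRESHOLD
```

Re-running the same metric after each cleansing cycle gives the objective "met or exceeded" signal the iteration stops on.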
Data Conversion (Mapping and Transformation)
• Work directly with business SMEs and systems integrator to:
    • Implement and document source-to-target field mappings
    • Define validation rules
    • Define transformation logic
• Work directly with business SMEs to populate and refine data programmatically and, where required, manually
• Implement mappings and transformations in Python/SQL code in Databricks
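A common way to implement the mappings and transformations above is a single map from source field to target field plus a per-field transform, so the mapping document and the code stay in step. A sketch in plain Python (all field names are hypothetical, not actual D365 entity fields):

```python
# Hypothetical source-to-target field map for a customer entity:
# source field -> (target field, transformation function)
FIELD_MAP = {
    "CustName":  ("OrganizationName", str.strip),
    "CustGroup": ("CustomerGroupId",  str.upper),
    "CredLimit": ("CreditLimit",      float),
}

def transform(source_row):
    """Apply the field map to one source row, producing a target row."""
    return {tgt: fn(source_row[src]) for src, (tgt, fn) in FIELD_MAP.items()}

row = {"CustName": " Acme Ltd ", "CustGroup": "uk01", "CredLimit": "5000"}
target = transform(row)
```

Keeping the map as data (rather than inline code) also makes it easy to generate the mapping documentation the SMEs sign off on.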
Data Entity & Framework Configuration
• Work with the system integrator / D365 platform team to support, where needed, the configuration and update of Data Management Framework (DMF) entities within D365
• Optimize performance for large dataset processing
• Work with the system integrator / D365 platform team to support, where needed, the creation of composite / custom data entities for complex migration scenarios
Data Migration Dry-Run and Cutover Management
• Utilize and refine data migration run-books during dry-runs and go-live cutover
• Execute trial migrations and iterative validation cycles, including:
    • Adhering to and supporting the planning and scheduling of dry-run cycles with the Data Lead and PMO
    • Coordinating with business users to conduct dry-run data validation activities
    • Ensuring appropriate documentation for audit purposes
    • Working with Testing workstream to provide test data
• Provide the systems integrator and business SMEs with the data required for transactional data migration
• Execute production data migration during go-live activities
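The dry-run validation cycles above usually include an auditable reconciliation between what was staged and what landed in D365. A minimal per-entity row-count check in plain Python (entity names and counts are illustrative only):

```python
def reconcile(source_counts, target_counts):
    """Compare per-entity row counts between the staged extract and the
    rows confirmed loaded; return only the entities that do not match,
    as entity -> (expected, loaded)."""
    mismatches = {}
    for entity, expected in source_counts.items():
        loaded = target_counts.get(entity, 0)
        if loaded != expected:
            mismatches[entity] = (expected, loaded)
    return mismatches

staged = {"Customers": 1200, "OpenInvoices": 340}
loaded = {"Customers": 1200, "OpenInvoices": 338}
issues = reconcile(staged, loaded)  # flags the OpenInvoices shortfall
```

A clean reconciliation report per dry-run is also exactly the audit artefact the documentation requirement calls for; value-level checks (sums, key samples) would sit alongside the counts in practice.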
QUALIFICATIONS, SKILLS & EXPERIENCE
• 3–5 years’ experience as a Data Engineer supporting ERP data migration or large-scale enterprise data programmes
• Proven experience across multiple end-to-end delivery cycles (dry runs through cutover)
• Experience with Microsoft Dynamics 365 Finance & Operations
• Strong experience with Azure Data Factory (ADF) for extraction and orchestration
• Strong experience with Azure Databricks for data transformation and staging
• Deep knowledge of data migration toolsets and scripting languages (e.g. SQL, Python)
• Familiarity with integration platforms and APIs to connect ERP with other enterprise systems
• Familiarity with ERP implementation methodologies
• Power BI development experience






