

Morgan McKinley
Data Engineer (Fabric)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (Fabric) on an initial 6-month contract, remote (UK-based), inside IR35. Key skills include Azure data engineering, Spark/PySpark, and experience with large datasets. SC Clearance is required, with active SC highly desirable.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 25, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Data Quality #Azure Data Factory #Azure #ADLS (Azure Data Lake Storage) #Data Pipeline #PySpark #Synapse #DevOps #ETL (Extract, Transform, Load) #Data Layers #Data Engineering #Datasets #Spark (Apache Spark) #ADF (Azure Data Factory)
Role description
Data Engineer (Fabric)
Remote (UK-based)
6 months (Initial Contract)
Inside IR35
SC Clearance required (active SC highly desirable)
Overview
We are looking for multiple Data Engineers to support the build and delivery of a modern data platform using Microsoft Fabric and Azure technologies.
Responsibilities
• Build and maintain data pipelines using Azure Data Factory / Fabric pipelines
• Develop transformations using Spark / PySpark
• Implement Lakehouse structures within OneLake (Fabric)
• Work across Bronze, Silver, and Gold data layers
• Optimise performance and ensure data quality
• Collaborate with architects and governance teams
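For candidates unfamiliar with the Bronze/Silver/Gold ("medallion") layering named in the responsibilities, the idea can be sketched in plain Python. This is illustrative only: in the actual role these layers would be PySpark transformations over OneLake in Microsoft Fabric, and the record fields and cleaning rules below are hypothetical.

```python
from collections import defaultdict

# Bronze layer: raw records landed as-is, including malformed rows.
bronze = [
    {"id": "1", "region": "UK ", "amount": "100.0"},
    {"id": "2", "region": "UK", "amount": "not-a-number"},  # bad row
    {"id": "3", "region": "IE", "amount": "50.5"},
]

def to_silver(rows):
    """Silver layer: validated, standardised records; bad rows dropped."""
    silver = []
    for row in rows:
        try:
            silver.append({
                "id": int(row["id"]),
                "region": row["region"].strip().upper(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline might quarantine these instead
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate, here total amount per region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 100.0, 'IE': 50.5}
```

The same shape applies at scale: Bronze preserves source fidelity, Silver enforces schema and data quality, and Gold serves curated aggregates to downstream consumers.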
Requirements
• Strong experience with Azure Data Engineering (ADF, ADLS, Synapse or Fabric)
• Hands-on experience with Spark / PySpark
• Experience working with large-scale datasets
• Understanding of data modelling and pipeline optimisation
• Exposure to CI/CD and DevOps practices (desirable)
• SC Clearance required (active SC highly desirable)
Application Process
If you are interested in this opportunity, please submit your up-to-date CV along with your availability and expected day rate.
Suitable applicants will be contacted to discuss the role in more detail, including project scope, interview process, and next steps.