Methods

(SC Cleared) Data Engineer - MS Fabric

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an SC Cleared Data Engineer specializing in MS Fabric, offering £500 to £550 per day, remote UK-based work, with an initial contract until March 2026. Key skills include Python, SQL, ETL/ELT pipelines, and data modeling.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
550
🗓️ - Date
January 10, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Inside IR35
🔒 - Security
Yes
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#AI (Artificial Intelligence) #REST (Representational State Transfer) #Data Ingestion #Cloud #Data Lakehouse #PySpark #Python #Data Lake #Data Pipeline #Version Control #Pandas #Azure #Data Engineering #Snowflake #Spark (Apache Spark) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Automation #Scala #Git #REST API
Role description
Daily rate: £500 to £550 inside IR35
Must be SC Cleared
Location: Remote - UK based only
Duration: initial contract until end of March 2026
Experience:
• Strong hands-on experience with Microsoft Fabric (Spark notebooks, pipelines, data flows).
• Proficient in Python for data engineering (e.g., Pandas, PySpark, asyncio, automation scripts).
• Strong SQL skills (T-SQL or similar) for transforming and modelling data.
• Experience building scalable ETL/ELT pipelines using cloud technologies.
• Good understanding of data modelling (star/snowflake schemas), data warehousing, and modern data lakehouse principles.
• Familiarity with version control (Git) and CI/CD pipelines.
• The main skill set is data engineering, specifically experience in ingesting and maintaining data pipelines.
• Knowledge of REST APIs and SFTP data ingestion is useful.
• Power BI for dashboarding.
• Some knowledge of LLMs would be useful, especially Azure AI.
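
For illustration only, a minimal sketch of the kind of ingestion work this role describes: pulling records from a REST API with Python and landing them in a lakehouse table with PySpark. The endpoint, table, and column names are hypothetical placeholders, not details taken from the role.

```python
# Illustrative sketch only: REST API ingestion into a lakehouse table.
# All endpoint, table, and column names below are hypothetical.
import requests
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Pull a batch of records from a (hypothetical) REST API.
resp = requests.get("https://example.com/api/orders", timeout=30)
resp.raise_for_status()
records = resp.json()

# Land the raw records, stamp them with an ingestion time, and de-duplicate.
df = (
    spark.createDataFrame(records)
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["order_id"])
)

# Append to a Delta table in the lakehouse (table name is a placeholder).
df.write.format("delta").mode("append").saveAsTable("bronze.orders_raw")
```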