

Sanderson
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snr Data Engineer with SC Clearance, offering £500/d - £525/d for a 6-month contract based in London (3 days/week). Key skills include Azure, ADF, PySpark, Scala, and experience with large data sets and data pipeline optimization.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
£500/d - £525/d
-
🗓️ - Date
December 4, 2025
🕒 - Duration
6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
London
-
🧠 - Skills detailed
#Scala #Data Pipeline #ETL (Extract, Transform, Load) #Spark (Apache Spark) #PySpark #ADF (Azure Data Factory) #Security #Azure #Data Engineering #Data Architecture #Databricks
Role description
Role title: Snr Data Engineer
Security Clearance required: SC Clearance
IR35 Status: Inside IR35
Pay Rates: £500/d - £525/d
Contract length: 6 months
Base Location: London - 3 days/week
Skills required:
- Azure / Azure Databricks
- ADF / ETL
- PySpark / Scala
Experience:
Experience working with large data sets and complex data pipelines. Understanding of data architecture and design, and of data pipeline optimisation.
Proven expertise with Databricks, including hands-on implementation experience and certifications.
Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients.
If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.






