Opus Recruitment Solutions
Contract Lead Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Contract Lead Data Engineer in London or Birmingham, lasting 12 months with a pay rate of £500pd - £600pd. Key skills include Java, AWS, Apache Spark, and Snowflake, with experience in data engineering and workflow upgrades required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
500 - 600
-
🗓️ - Date
February 11, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Outside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #Data Pipeline #AWS Glue #Java #Apache Spark #Data Engineering #Snowflake
Role description
Contract Lead Data Engineers
📍 Location: London or Birmingham
📆 Duration: 12-Month Contract
✅ IR35: Outside IR35
Rate: £500pd - £600pd
Opus is partnered with a financial services client to deliver a major programme of work.
You’ll join an established engineering group, working alongside internal teams to build a new reporting workflow, upgrade an existing pipeline, and lead a full data‑sourcing uplift across multiple reporting workflows. The team will also be responsible for helping upgrade the framework for two critical internal workflows.
Tech Stack
• Java
• AWS (incl. AWS Glue)
• Apache Spark
• Snowflake
Required Experience
• Strong background in data engineering within distributed data environments
• Hands-on expertise with AWS, Spark, Glue, and Snowflake
• Experience building and optimising data pipelines & reporting workflows
• Ability to work closely with internal engineering and controls teams
• Experience upgrading or modernising existing workflows and frameworks
• For Lead-level: prior experience leading engineering teams or workstreams
📩 Interested?
Click apply or drop me an email at matthew.carter@opusrs.com