Impellam Group

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown," offering a pay rate of "unknown," based in London. Key skills include Scala/Java, Quantexa experience, and financial services industry experience. Hybrid work model: 1-2 days onsite.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Inside IR35
-
🔒 - Security
Unknown
-
📍 - Location detailed
London Area, United Kingdom
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Data Quality #Datasets #Data Engineering #Data Processing #Data Pipeline #Data Architecture #Documentation #Data Wrangling #HBase #Data Ingestion #Data Science #Scala #Java
Role description
Lorien's leading banking client is looking for a Data Engineer to join a newly built team on a new project. The ideal candidate will design, build, and maintain scalable data pipelines and analytical solutions, with a strong emphasis on Scala-based data processing, graph analytics, advanced data wrangling, and hands-on experience with Quantexa. You will work closely with data scientists, analysts, and platform engineers to enable insights from complex, interconnected datasets.

This role is based in London, is paid via umbrella, and follows a hybrid model of 1-2 days a week on site.

Key Skills and Experience
• Design, develop, and maintain scalable data pipelines using Scala/Java and modern distributed data processing frameworks.
• Build and optimise graph-based data models to analyse relationships, networks, and dependencies across large datasets.
• Experience using Quantexa is highly advantageous.
• Implement graph analytics algorithms (e.g. centrality, community detection, path analysis) to support advanced analytical use cases.
• Perform complex data wrangling and transformation on structured and semi-structured data.
• Develop and maintain scripts (e.g. in Scala) to automate data ingestion, validation, and processing tasks.
• Ensure data quality, reliability, and performance across data workflows.
• Collaborate with stakeholders to translate business requirements into robust data solutions.
• Contribute to data architecture, best practices, and documentation.
• Monitor and troubleshoot data pipelines in production environments.
• Experience working within financial services environments.

IND\_PC3
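For candidates unfamiliar with the graph analytics work mentioned above, here is a minimal sketch of one of the named algorithms (degree centrality) in plain Scala. The edge list and entity IDs are purely hypothetical, and a real deployment on large datasets would use a distributed framework rather than in-memory collections; this only illustrates the core idea of ranking entities by how connected they are.

```scala
// Minimal degree-centrality sketch using only the Scala standard library.
// Edges are undirected pairs of (hypothetical) entity IDs.
object DegreeCentrality {
  val edges: Seq[(String, String)] = Seq(
    ("acct-1", "acct-2"),
    ("acct-1", "acct-3"),
    ("acct-2", "acct-3"),
    ("acct-3", "acct-4")
  )

  // Degree centrality: the number of edges touching each node,
  // normalised by the maximum possible degree (n - 1).
  def degreeCentrality(edges: Seq[(String, String)]): Map[String, Double] = {
    val degrees = edges
      .flatMap { case (a, b) => Seq(a, b) } // each edge counts for both endpoints
      .groupBy(identity)
      .view.mapValues(_.size.toDouble)
      .toMap
    val n = degrees.size
    degrees.map { case (node, d) => node -> d / (n - 1) }
  }

  def main(args: Array[String]): Unit = {
    // Print entities from most to least connected.
    degreeCentrality(edges).toSeq.sortBy(-_._2).foreach {
      case (node, c) => println(f"$node%-8s $c%.2f")
    }
  }
}
```

In this toy graph, acct-3 touches three of the four nodes and so scores 1.0, which is the kind of "most connected entity" signal that network-analysis platforms surface for investigation.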