Hirexa Solutions

Pentaho Consultant

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Pentaho Consultant in London, UK, offering a permanent position. Required skills include expertise in Pentaho Data Integration, advanced SQL, and familiarity with Java or Python. 6–12 years of experience in Data Integration is necessary, preferably in Banking/Financial Services. Certifications in Pentaho or cloud platforms are preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
December 23, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Schema Design #Automation #Data Profiling #Cloud #Spark (Apache Spark) #Database Systems #Scripting #Python #SQL (Structured Query Language) #Big Data #Data Analysis #Informatica #Hadoop #GCP (Google Cloud Platform) #Talend #Azure #Java #Data Manipulation #Data Integration #VBA (Visual Basic for Applications) #Data Quality #SSIS (SQL Server Integration Services) #AWS (Amazon Web Services)
Role description
Role title: Pentaho Consultant
Location: London, UK
Hybrid: 3 days at the client office
Employment Type: Permanent

Required Skills
- Expertise in Pentaho Data Integration for ETL processes.
- Strong understanding of data warehousing concepts, data modelling, and schema design.
- Experience with other ETL tools (Informatica, Talend, SSIS) is a plus.
- Advanced SQL for querying and data manipulation.
- Familiarity with Java, Python, or VBA for scripting and automation.
- Knowledge of RESTful APIs and various database systems.
- Skills in data profiling, data analysis, and ensuring data quality.

Experience
- 6–12 years in Data Integration and ETL development.
- Minimum 3–5 years of hands-on Pentaho experience.
- Leading complex ETL projects and working with cross-functional teams.
- Exposure to the Banking/Financial Services domain is a plus.

Preferred Qualifications
- Certifications in Pentaho, Big Data, or cloud platforms (AWS/GCP/Azure).
- Experience with Big Data technologies (Hadoop, Spark) and cloud data services.