

Whitehall Resources
ETL Data Engineer - SC Cleared
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Data Engineer - SC Cleared, requiring over 6 months of contract work. Key skills include expertise in Pentaho, Talend, Denodo, SAS, Agile methodologies, and DevOps practices. Security clearance is mandatory.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 12, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Inside IR35
-
🔒 - Security
Yes
-
📍 - Location detailed
Shropshire, England, United Kingdom
-
🧠 - Skills detailed
#Virtualization #SQL (Structured Query Language) #DevOps #ETL (Extract, Transform, Load) #Security #Jenkins #Data Integration #SAS #Agile #Data Engineering #Talend #Kubernetes #Docker #Leadership #GIT #Scrum #BI (Business Intelligence) #Deployment #Scala #Automated Testing #Data Quality #Data Pipeline
Role description
ETL Data Engineer – SC Cleared
Whitehall Resources are currently looking for an ETL Data Engineer – SC Cleared.
You will be required to use an FCSA Accredited Umbrella Company for this role.
You must not have been outside of the UK for more than 6 months in the last 5 years.
Key Requirements:
- Lead the design, development, and deployment of data integration and transformation solutions using Pentaho, Denodo, Talend, and SAS.
- Architect and implement scalable data pipelines and services that support business intelligence and analytics platforms.
- Collaborate with cross-functional teams to gather requirements, define technical specifications, and deliver robust data solutions.
- Champion Agile and Scrum methodologies, ensuring timely delivery of sprints and continuous improvement.
- Drive DevOps practices for CI/CD, automated testing, and deployment of data services.
- Mentor and guide junior engineers, fostering a culture of technical excellence and innovation.
- Ensure data quality, governance, and security standards are upheld across all solutions.
- Troubleshoot and resolve complex data issues and performance bottlenecks.
Key Skills:
- Strong expertise in ETL tools: Pentaho, Talend.
- Experience with data virtualization using Denodo.
- Proficiency in SAS for data analytics and reporting.
- Solid understanding of Agile and Scrum frameworks.
- Hands-on experience with DevOps tools and practices (e.g., Jenkins, Git, Docker, Kubernetes).
- Strong SQL and data modelling skills.
- Excellent problem solving, communication, and leadership abilities.
- Proven track record of leading data projects and teams.
- Certifications in Agile/Scrum, DevOps, or relevant data technologies are a plus.
