

TXP
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with strong SAS expertise, preferably with SC clearance. Contract length and pay rate are unspecified. Key skills include SAS 9.4, SQL, data pipeline construction, and Agile/Scrum experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
February 14, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Yes
📍 - Location detailed
England, United Kingdom
🧠 - Skills detailed
#Jenkins #Data Pipeline #ETL (Extract, Transform, Load) #Leadership #Talend #Agile #Scala #GIT #SAS #SQL (Structured Query Language) #Docker #Data Ingestion #DevOps #Datasets #Data Engineering #Security #Oracle #Scrum #Kubernetes #Data Quality
Role description
Senior Data Engineer – SAS
SC clearance is highly preferred
Role Overview:
We are seeking a Senior Data Engineer with strong SAS expertise to join a Scrum team delivering data ingestion, transformation, and analytics solutions within a strategic enterprise-scale platform. The platform integrates complex, high-volume data sources to enable advanced analytics, risk assessment, and operational decision-making.
This role is ideal for candidates with hands-on experience in SAS 9.4 (DI) and SAS Studio. Experience with SAS Viya 4 and SAS RTENG is highly preferred but not mandatory.
Key Responsibilities:
• Design, build, and maintain data ingestion and transformation pipelines using SAS tools (an illustrative sketch of this kind of step follows this list).
• Integrate and consolidate data from multiple complex sources into enterprise analytics platforms.
• Translate business requirements into robust, scalable, and high-performing data solutions.
• Apply data quality, governance, and security best practices across all pipelines.
• Optimise workflows for performance and reliability.
• Troubleshoot complex data issues and provide sustainable solutions.
• Collaborate with cross-functional teams and mentor junior engineers.
• Support Agile/Scrum ways of working, ensuring timely sprint delivery and continuous improvement.
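For illustration only, and not part of the client's specification: a minimal Python sketch of the kind of ingest-transform-validate step described in the responsibilities above. On this engagement the work would be built with SAS DI / SAS Studio rather than Python, and the file name, column names, and threshold below are hypothetical placeholders.

# Illustrative sketch only: a minimal ingest-transform-validate pipeline step.
# File path, column names, and the 10,000 threshold are hypothetical.
import csv
from pathlib import Path

def load_rows(path):
    """Read a delimited source extract into a list of dicts."""
    with Path(path).open(newline="") as handle:
        return list(csv.DictReader(handle))

def transform(rows):
    """Standardise types and derive a simple high-value flag."""
    cleaned = []
    for row in rows:
        amount = float(row["amount"])            # hypothetical column
        cleaned.append({
            "case_id": row["case_id"].strip(),   # hypothetical column
            "amount": amount,
            "high_value": amount > 10_000,       # hypothetical threshold
        })
    return cleaned

def validate(rows):
    """Basic data-quality gate: no missing keys, no negative amounts."""
    problems = [r for r in rows if not r["case_id"] or r["amount"] < 0]
    if problems:
        raise ValueError(f"{len(problems)} rows failed data-quality checks")
    return rows

if __name__ == "__main__":
    staged = validate(transform(load_rows("source_extract.csv")))  # hypothetical file
    print(f"{len(staged)} rows ready to load")

In SAS terms this would typically map to DI Studio jobs or DATA steps plus PROC SQL; the sketch only shows the shape of the work, not the client's actual pipeline.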
Skills and Experience:
• Extensive hands-on experience with SAS 9.4 (DI) and SAS Studio.
• Experience with SAS Viya 4 and SAS RTENG is highly preferred.
• Strong SQL and data modelling skills (see the SQL sketch after this list).
• Proven experience building data pipelines in enterprise-scale environments with high-volume, complex datasets.
• Knowledge of ETL tools (Pentaho, Talend) and data virtualisation (Denodo) is desirable.
• Exposure to DevOps tools and practices (Git, Jenkins, Docker, Kubernetes) is advantageous.
• Experience with Oracle is a plus.
• Strong analytical, problem-solving, communication, and leadership abilities.
• Experience working within Agile/Scrum delivery frameworks.
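Again for illustration only: a self-contained sketch of the kind of SQL consolidation implied by the SQL and data-modelling requirement above, using Python's built-in sqlite3 module so it runs without any external services. The table and column names are hypothetical.

# Illustrative sketch only: consolidating two hypothetical source tables with SQL,
# using an in-memory SQLite database so the example is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE transactions (txn_id INTEGER PRIMARY KEY,
                               customer_id INTEGER,
                               amount REAL);
    INSERT INTO customers VALUES (1, 'North'), (2, 'South');
    INSERT INTO transactions VALUES (10, 1, 250.0), (11, 1, 90.0), (12, 2, 40.0);
""")

# Consolidate per-customer spend by region: the kind of join-and-aggregate
# query this role would usually express in PROC SQL or push down to Oracle.
for region, total in conn.execute("""
        SELECT c.region, SUM(t.amount) AS total_spend
        FROM transactions AS t
        JOIN customers AS c ON c.customer_id = t.customer_id
        GROUP BY c.region
        ORDER BY total_spend DESC
"""):
    print(region, total)

The same pattern applies regardless of engine; the sketch is only a language-neutral illustration, not the client's schema.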