

AWS Data Engineer - SC Cleared
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer - SC Cleared, remote for 12 months, offering competitive pay. Key skills include ETL development, AWS services, PySpark, Python, SQL, and data governance. Experience in Agile environments is advantageous.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 14, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Yes
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Metadata #Data Governance #Spark (Apache Spark) #AWS Glue #Data Pipeline #Data Processing #Cloud #Databricks #Scala #ETL (Extract, Transform, Load) #Version Control #PySpark #SQL (Structured Query Language) #Datasets #Automation #Data Architecture #GitLab #Agile #Data Quality #DevOps #Informatica #S3 (Amazon Simple Storage Service) #Data Management #GitHub #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Data Manipulation #Data Engineering #Python #Data Ingestion
Role description
AWS Data Engineer - SC cleared - remote - 12 months - UK
Key Responsibilities
• Develop, maintain, and optimize ETL pipelines using AWS Glue (Informatica experience is beneficial).
• Build and manage cloud-based data pipelines leveraging AWS services (e.g., EMR, S3, Lambda, Glue).
• Implement scalable data processing workflows using Databricks, PySpark, Python, and SQL.
• Design and support data ingestion, transformation, and integration processes across structured and unstructured data sources.
• Collaborate with data architects, analysts, and business stakeholders to understand requirements and deliver reliable data solutions.
• Monitor pipeline performance, troubleshoot issues, and ensure data quality and reliability.
• Contribute to best practices for data engineering, including version control, CI/CD, and automation.
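The extract-transform-load workflow these responsibilities describe can be sketched in miniature. This is an illustration only, not part of the role: a production pipeline here would use AWS Glue, Databricks, or PySpark against S3 rather than the standard library, and all field names (`id`, `amount`) are hypothetical.

```python
"""Minimal, stdlib-only sketch of an extract-transform-load step,
with a simple data-quality gate of the kind the role mentions."""
import csv
import io
import json


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV records into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalise types and drop records failing a quality check."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # data-quality gate: skip malformed records
        out.append({"id": row["id"].strip(), "amount": round(amount, 2)})
    return out


def load(rows: list[dict]) -> str:
    """Load: serialise as JSON lines, as if writing an S3 object."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in rows)


raw = "id,amount\na1,10.5\na2,not-a-number\na3,3.333\n"
print(load(transform(extract(raw))))
```

In a Glue or PySpark job the same three stages map onto reading from a source (e.g. S3), DataFrame transformations with quality filters, and writing to a target table or bucket.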
Required Skills & Qualifications
• Strong hands-on experience with ETL development and orchestration (AWS).
• Solid AWS cloud experience, including working with core data services.
• Expertise in building distributed data pipelines using EMR, PySpark, or similar technologies.
• Strong data processing and transformation experience across large datasets.
• Proficiency in PySpark, Python, and SQL for data manipulation and automation.
• Understanding of data modelling, data warehousing concepts, and performance optimization.
• Familiarity with CI/CD tools (DevOps, GitHub, GitLab).
• Exposure to data governance, metadata management, and data quality frameworks.
• Experience working in Agile environments is a plus.





