Simon James IT Ltd

Data Engineer - 12 Months, Remote - SC Cleared

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position for a 12-month contract, fully remote, requiring SC clearance. Key skills include ETL/ELT development, PySpark, Python, SQL, and AWS EMR. Strong experience in cloud-native data processing and pipeline engineering is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date
March 14, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Yes
📍 - Location detailed
England, United Kingdom
🧠 - Skills detailed
#Deployment #ADF (Azure Data Factory) #Monitoring #Data Processing #PySpark #SQL (Structured Query Language) #AWS (Amazon Web Services) #Informatica #Data Accuracy #AWS Glue #Data Engineering #Data Pipeline #Cloud #AWS EMR (Amazon Elastic MapReduce) #Python #Spark (Apache Spark) #ETL (Extract, Transform, Load)
Role description
Data Engineer – ETL/ELT Development | 12-Month Contract (Fully Remote)

We're seeking a Data Engineer with strong hands-on experience in ETL development, AWS cloud services, and large-scale data pipelines built on EMR and PySpark. This fully remote role focuses on building, optimising, and maintaining modern data processing workflows across cloud environments.

Core Tech Requirements
• ETL/ELT pipeline engineering (Informatica, ADF, AWS Glue, or similar)
• PySpark, Python, and SQL expertise
• AWS EMR and cloud-native data processing experience
• Strong knowledge of data processing and data-quality best practices

What You'll Do
• Build and optimise large-scale ETL/ELT pipelines in AWS
• Develop and tune PySpark jobs for EMR workloads
• Improve performance, reliability, and cost efficiency across data workflows
• Ensure data accuracy and consistency across ingestion and processing layers
• Support orchestration, monitoring, and deployments in AWS environments
• Troubleshoot production issues with a focus on long-term stability
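For candidates gauging the expected skill set, here is a minimal sketch of the extract-transform-load pattern the role centres on. It is illustrative only, not the client's codebase: it uses plain Python with the standard-library sqlite3 module as a stand-in for a cloud warehouse, and the table name, columns, and quality rule are assumptions. In the actual role the equivalent logic would be expressed as PySpark transformations running on AWS EMR.

```python
import sqlite3

# Illustrative ETL sketch: extract raw records, transform (clean and apply a
# data-quality rule), load into a target table. sqlite3 stands in for a cloud
# warehouse; on EMR these steps would become PySpark transformations.

def extract():
    # Hypothetical raw source records; some need cleaning (whitespace,
    # missing amount).
    return [
        {"id": 1, "name": " Alice ", "amount": "100.50"},
        {"id": 2, "name": "Bob", "amount": None},
        {"id": 3, "name": "Carol", "amount": "75.00"},
    ]

def transform(rows):
    # Data-quality rule (assumed): drop rows with a missing amount,
    # trim names, cast amounts to float.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # in production this would land in a rejects table
        clean.append((row["id"], row["name"].strip(), float(row["amount"])))
    return clean

def load(rows, conn):
    # Idempotent load: INSERT OR REPLACE keyed on id makes reruns safe.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales "
        "(id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
# → (2, 175.5): the row with a missing amount was rejected
```

The idempotent load and the explicit quality gate mirror two of the listed responsibilities: ensuring data accuracy across ingestion layers and keeping reruns safe for long-term production stability.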