

Data Engineer (SC Cleared)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (SC Cleared) on a 12-month contract, mostly remote but requiring access to London or Bristol. Pay is negotiable, inside IR35. Key skills include Python, SQL, Apache Airflow, AWS, and Kafka.
Country: United Kingdom
Currency: £ GBP
Day rate: Negotiable
Date discovered: August 7, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Inside IR35
Security clearance: Yes
Location detailed: Guildford, England, United Kingdom
Skills detailed: #Airflow #Spark (Apache Spark) #SQL (Structured Query Language) #Apache Airflow #Automated Testing #Schema Design #Cloud #Kafka (Apache Kafka) #Batch #Apache Spark #S3 (Amazon Simple Storage Service) #Compliance #AWS S3 (Amazon Simple Storage Service) #Data Lake #Deployment #Docker #Python #AWS (Amazon Web Services) #DevOps #Security #Version Control #Data Pipeline #Data Engineering #ETL (Extract, Transform, Load) #Kubernetes #Data Processing #Big Data #Redshift
Job Description
Data Engineer (SC cleared)
Start: ASAP
Duration: 12 months
Location: Mostly remote; must have access to London or Bristol
Pay: Negotiable, inside IR35
Responsibilities
• Design and implement robust ETL/ELT data pipelines using Apache Airflow (a minimal sketch follows this list)
• Build ingestion processes from internal systems and APIs using Kafka, Spark, and AWS
• Develop and maintain data lakes and warehouses (AWS S3, Redshift)
• Ensure data governance using automated testing tools
• Collaborate with DevOps to manage CI/CD pipelines for data deployments and ensure version control of DAGs
• Apply best practices in security and compliance
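For a sense of what the first responsibility involves, here is a minimal sketch of an extract-transform-load DAG, assuming Airflow 2.x; the DAG id, schedule, and placeholder extract/transform/load callables are hypothetical rather than taken from this project.

```python
# Minimal illustrative Airflow DAG (extract -> transform -> load).
# All names and the schedule below are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder for pulling raw records from an internal system or API.
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Pull the upstream task's output from XCom and apply a trivial transformation.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value_doubled": row["value"] * 2} for row in rows]


def load(**context):
    # A real pipeline would write to S3/Redshift here; this sketch only logs.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="example_etl_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",              # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

Keeping each stage as its own task, as above, is what makes the retry and version-control expectations in the later bullets practical: each task can fail and retry independently, and the DAG file itself lives in source control.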
Required Tech Skills
• Python and SQL for data processing
• Apache Airflow: writing DAGs and configuring Airflow jobs
• AWS cloud platform and services such as S3 and Redshift
• Familiarity with big data processing using Apache Spark
• Knowledge of data modelling, schema design, and partitioning strategies
• Understanding of batch vs. streaming data paradigms (a short Spark sketch follows this list)
• Docker or Kubernetes (containerization)
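To make the batch-versus-streaming distinction concrete, here is a short PySpark sketch of the same count expressed both ways, assuming the Spark Kafka connector is available on the classpath; the S3 paths, broker address, and topic name are hypothetical.

```python
# Illustrative only: one aggregation written as a bounded batch job and as an
# unbounded streaming job. Paths, broker, and topic names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch_vs_streaming_sketch").getOrCreate()

# Batch: read a bounded dataset from S3, aggregate once, write the result once.
batch_df = spark.read.parquet("s3a://example-bucket/events/")            # hypothetical path
batch_counts = batch_df.groupBy("event_type").count()
batch_counts.write.mode("overwrite").parquet("s3a://example-bucket/event_counts/")

# Streaming: read an unbounded Kafka topic and maintain the same aggregate continuously.
stream_df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")                    # hypothetical broker
    .option("subscribe", "events")                                       # hypothetical topic
    .load()
)
stream_counts = (
    stream_df.select(
        F.get_json_object(F.col("value").cast("string"), "$.event_type").alias("event_type")
    )
    .groupBy("event_type")
    .count()
)
query = (
    stream_counts.writeStream
    .outputMode("complete")   # full recomputed counts on each trigger
    .format("console")
    .start()
)
# query.awaitTermination()  # uncomment to keep the streaming job running
```

The batch path finishes and exits; the streaming path runs until stopped and re-emits updated counts as new Kafka messages arrive, which is the practical difference the paradigms bullet refers to.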