

DevOps Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps Engineer on a day-rate contract, fully remote, requiring 3+ years of Kubernetes experience, strong Terraform skills, and AWS proficiency. Python automation and familiarity with data engineering are preferred.
Country
United Kingdom
Currency
£ GBP
Day rate
-
Date discovered
June 27, 2025
Project duration
Unknown
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
United Kingdom
Skills detailed
#Security #Data Engineering #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Kubernetes #S3 (Amazon Simple Storage Service) #Automation #Infrastructure as Code (IaC) #RDS (Amazon Relational Database Service) #Scala #IAM (Identity and Access Management) #Data Processing #Terraform #Deployment #Spark (Apache Spark) #Istio #Cloud #EC2 #DevOps #Python
Role description
Day-Rate Contract - Fully Remote
We are seeking an experienced DevOps Engineer, ideally with a strong background and in-depth, hands-on experience with Kubernetes.
In this role, you will design, implement, and maintain our containerized infrastructure on AWS, working with cutting-edge technologies to ensure the reliability, scalability, and security of our data platforms.
Key Responsibilities
• Design, deploy, and maintain production-grade Kubernetes clusters on AWS
• Implement Infrastructure as Code using Terraform for all cloud resources
• Manage CI/CD pipelines and GitOps workflows with ArgoCD
• Collaborate with data engineering teams to optimize container-based data processing solutions
• Develop automation scripts and tools using Python to streamline operations
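To give a flavour of the Python automation work the responsibilities above describe, here is a minimal, hypothetical sketch (not part of the posting): a helper that flags Kubernetes pods with excessive container restarts, parsing the JSON that `kubectl get pods -o json` emits. The function name and threshold are illustrative assumptions.

```python
import json

# Hypothetical example of day-to-day Python tooling for this role:
# flag pods whose containers have restarted more than `threshold` times,
# given JSON in the shape produced by `kubectl get pods -o json`.
def flag_restarting_pods(pods_json: str, threshold: int = 5) -> list[str]:
    """Return names of pods with any container restarted more than `threshold` times."""
    pods = json.loads(pods_json)["items"]
    flagged = []
    for pod in pods:
        statuses = pod.get("status", {}).get("containerStatuses", [])
        if any(s.get("restartCount", 0) > threshold for s in statuses):
            flagged.append(pod["metadata"]["name"])
    return flagged

if __name__ == "__main__":
    # Sample payload standing in for real `kubectl` output.
    sample = json.dumps({"items": [
        {"metadata": {"name": "api-7f9c"},
         "status": {"containerStatuses": [{"restartCount": 12}]}},
        {"metadata": {"name": "worker-2b1d"},
         "status": {"containerStatuses": [{"restartCount": 0}]}},
    ]})
    print(flag_restarting_pods(sample))  # -> ['api-7f9c']
```

In practice a script like this would be fed live cluster output and wired into alerting, but the parsing logic is the same.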
Essential Skills & Experience
• 3+ years of hands-on experience with Kubernetes in production environments
• Strong working knowledge of Terraform for infrastructure provisioning
• Extensive experience with AWS services (EKS, EC2, S3, RDS, IAM, etc.)
• Proficiency in Python for automation and tooling
• Experience with ArgoCD or similar GitOps tools
Preferred Qualifications
• Background in data engineering or experience supporting data platforms
• Familiarity with data processing frameworks (Spark, Kafka, etc.)
• Experience with Helm charts for application deployment
• Knowledge of service mesh technologies (Istio, Linkerd)