

BlueRose Technologies
Confluent / Kafka Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Confluent/Kafka Engineer on a remote contract basis for 6 months, paying "£X per hour". Requires 5+ years of Kafka experience, strong Java/Python/Scala skills, cloud deployment expertise, and familiarity with Docker/Kubernetes.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 25, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United Kingdom
-
🧠 - Skills detailed
#Monitoring #Scala #Consulting #Data Lake #Programming #REST (Representational State Transfer) #Deployment #Docker #Kafka (Apache Kafka) #Kubernetes #Migration #Cloud #Splunk #Java #Data Pipeline #Apache Kafka #Grafana #Data Engineering #DevOps #Azure #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #Python
Role description
We're Hiring
Confluent/Kafka Consulting Engineer
📍 Location: Remote (work from anywhere in the UK; the candidate must be based in the UK for the duration of the contract)
📄 Type: Contract
Are you passionate about real-time data streaming and modern event-driven architectures?
We’re looking for an experienced Confluent/Kafka Consulting Engineer to join our team on a remote basis.
In this role, you’ll work with a talented group of data engineers, architects, and DevOps professionals to design and implement scalable, high-performance streaming solutions using Apache Kafka and the Confluent ecosystem.
🔧 What You'll Do:
• Build and maintain real-time data pipelines and integrations
• Collaborate with cross-functional teams to deliver robust Kafka-based solutions
• Deploy and manage Kafka across cloud platforms like AWS, GCP, or Azure
• Design event-driven systems that power critical business processes
✅ Must-Have Skills:
• 5+ years of hands-on experience with Apache Kafka (any distribution)
• Strong programming skills in Java, Python, or Scala
• Deep understanding of event-driven architectures
• Experience with cloud deployment (AWS, GCP, Azure)
• Familiarity with Docker, Kubernetes, and CI/CD pipelines
• Excellent communication skills
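As a hypothetical illustration of the Docker familiarity this role calls for, a single-node Kafka broker for local development can be sketched with a minimal Compose file (assumptions: the official `apache/kafka` image and its default single-node KRaft configuration; pin an image tag appropriate to your project in real use):

```yaml
# Minimal local-dev sketch: one Kafka broker in KRaft mode (no ZooKeeper).
# The apache/kafka image ships sensible single-node defaults, so no
# environment overrides are needed for a localhost-only broker.
services:
  kafka:
    image: apache/kafka:3.7.0   # version tag is an assumption; pin your own
    ports:
      - "9092:9092"             # expose the broker to clients on the host
```

With this running (`docker compose up -d`), clients on the host can reach the broker at `localhost:9092`; production deployments on AWS, GCP, or Azure would instead use managed services or a properly configured multi-broker cluster.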
⭐ Nice-to-Have Skills:
• Proficiency in Confluent Kafka and tools like:
• Kafka Connect, Kafka Streams, ksqlDB, Schema Registry, REST Proxy, Control Center
• Experience with Confluent Cloud, Apache Flink, and KRaft migration
• Familiarity with Stream Governance, RBAC, Audit Logs
• Confluent certifications (Developer, Administrator, Flink Developer)
• Exposure to data mesh, data lakes, and monitoring tools like Grafana or Splunk
🌍 Why Join Us?
• 100% remote flexibility within the UK
• Work on cutting-edge streaming technologies
• Collaborate with a passionate and expert-level team
• Make an impact in large-scale, real-world data systems
In light of project-related limitations, only applicants holding a permanent visa will be considered for this position. Temporary visa holders are not eligible at this time.
🎯 Ready to take your Kafka skills to the next level?
Apply now or message us directly to learn more!
#Kafka #Confluent #StreamingData #RemoteJobs #DataEngineering #EventDrivenArchitecture #UKJobs #LondonJobs #ApacheFlink #DataMesh #DevOps






