

Confluent Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Confluent Engineer with a contract length of "unknown," offering a pay rate of "unknown." It requires 9+ years of experience, strong skills in Apache Kafka, Java, Python, or Scala, and familiarity with cloud platforms.
Country: United Kingdom
Currency: £ GBP
Day rate: -
Date discovered: August 22, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: London, England, United Kingdom
Skills detailed: #Data Warehouse #AWS (Amazon Web Services) #Monitoring #Data Pipeline #Consulting #Kafka (Apache Kafka) #Data Engineering #Docker #Scala #Deployment #Apache Kafka #Prometheus #Cloudera #Big Data #REST (Representational State Transfer) #Grafana #Azure #Kubernetes #Data Lineage #Splunk #DevOps #Data Lake #GCP (Google Cloud Platform) #Java #Cloud #Python #Migration
Role description
Role: Confluent Engineer
Work Mode: Remote
Location: UK
Experience: 9+ years
As a Confluent Consulting Engineer, you will be responsible for designing, developing, and maintaining scalable real-time data pipelines and integrations using Kafka and Confluent components. You will collaborate with data engineers, architects, and DevOps teams to deliver robust streaming solutions.
Mandatory Skills
• 5+ years of hands-on experience with Apache Kafka (any distribution: open-source, Confluent, Cloudera, AWS MSK, etc.)
• Strong proficiency in Java, Python, or Scala
• Solid understanding of event-driven architecture and data streaming patterns
• Experience deploying Kafka on cloud platforms such as AWS, GCP, or Azure
• Familiarity with Docker, Kubernetes, and CI/CD pipelines
• Excellent problem-solving and communication abilities
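The event-driven patterns mentioned above rest on one of Kafka's core guarantees: events carrying the same key are routed to the same partition, which preserves per-key ordering. A minimal Python sketch of that key-to-partition mapping follows; it uses MD5 purely for illustration (Kafka's default partitioner actually uses murmur2), and the key names are hypothetical.

```python
import hashlib

def assign_partition(key: str, num_partitions: int) -> int:
    """Map an event key to a partition deterministically.

    Illustrative only: Kafka's default partitioner uses murmur2,
    not MD5. The point demonstrated here is that the same key
    always lands on the same partition, so a consumer reading
    that partition sees the key's events in production order.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Events for "order-42" always hit one partition, keeping
# "created" before "paid" from the consumer's point of view.
events = [("order-42", "created"), ("order-17", "created"), ("order-42", "paid")]
placement = [(key, assign_partition(key, 6)) for key, _ in events]
```

Because the mapping is a pure function of the key, ordering holds without any coordination between producers, which is what makes the pattern scale.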
Desired Skills
Candidates with experience in Confluent Kafka and its ecosystem will be given preference:
• Experience with Kafka Connect, Kafka Streams, KSQL, Schema Registry, REST Proxy, Confluent Control Center
• Hands-on with Confluent Cloud services, including ksqlDB Cloud and Apache Flink
• Familiarity with Stream Governance, Data Lineage, Stream Catalog, Audit Logs, RBAC
• Confluent certifications (Developer, Administrator, or Flink Developer)
• Experience with Confluent Platform, Confluent Cloud managed services, multi-cloud deployments, and Confluent for Kubernetes
• Knowledge of data mesh architectures, KRaft migration, and modern event streaming patterns
• Exposure to monitoring tools (Prometheus, Grafana, Splunk)
• Experience with data lakes, data warehouses, or big data ecosystems
Personal
• Strong analytical skills
• A high degree of initiative and flexibility
• Strong customer orientation
• High quality awareness
• Excellent verbal and written communication skills