

Confluent Kafka/Connector
Featured Role | Apply direct with Data Freelance Hub
This role is for a Confluent Kafka/Connector specialist for a contract of "X months" at a pay rate of "$Y/hour". Key skills required include hands-on experience with the Confluent Kafka ecosystem, HTTP connectors, and proficiency in Java/Python.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 20, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#Kafka (Apache Kafka) #AWS (Amazon Web Services) #Debugging #Java #Microservices #S3 (Amazon Simple Storage Service) #Scala #Cloud #Kubernetes #REST (Representational State Transfer) #Monitoring #JDBC (Java Database Connectivity) #Python #Azure #Consulting #GCP (Google Cloud Platform) #REST API #Docker
Role description
Responsibilities
β’ Design, configure, and maintain Confluent Kafka clusters.
β’ Implement and manage HTTP source/sink connectors for real-time data flow.
β’ Troubleshoot and optimize Kafka connectors and integrations with REST APIs.
β’ Collaborate with teams to build scalable, event-driven architectures.
• Provide best practices and consulting support on the Confluent platform.
Required Skills
β’ Hands-on experience with Confluent Kafka ecosystem (Kafka Connect, Schema Registry, KSQL).
β’ Strong expertise in HTTP connectors (source/sink).
β’ Solid knowledge of distributed systems & event streaming.
β’ Experience in Java/Python for debugging and customization.
β’ Familiarity with cloud (AWS/Azure/GCP) and Docker/Kubernetes.
Nice to Have
β’ Experience with other connectors (JDBC, S3, Debezium).
β’ Knowledge of Confluent Control Center & monitoring tools.
β’ Exposure to microservices architecture.
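For context on the HTTP connector work described above: a Kafka Connect HTTP sink connector is typically registered by submitting a JSON configuration to the Connect REST API. The sketch below is purely illustrative and not part of this listing; the connector name, topic, endpoint URL, and bootstrap servers are assumed placeholders.

```json
{
  "name": "http-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "http.api.url": "https://example.com/api/events",
    "request.method": "POST",
    "headers": "Content-Type:application/json",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "reporter.bootstrap.servers": "localhost:9092",
    "reporter.result.topic.replication.factor": "1",
    "reporter.error.topic.replication.factor": "1"
  }
}
```

A config like this would normally be POSTed to a Connect worker (for example, `http://localhost:8083/connectors`); exact property names and required fields depend on the connector version in use.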