

KAnand Corporation
Confluent Kafka/Connector
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Confluent Kafka/Connector specialist, offered for a contract length of "X months" at a pay rate of "$Y/hour". It requires hands-on experience with Confluent Kafka, HTTP connectors, distributed systems, and Java/Python, plus familiarity with cloud services and Docker/Kubernetes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
September 30, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#REST (Representational State Transfer) #JDBC (Java Database Connectivity) #Python #Azure #Kafka (Apache Kafka) #Microservices #S3 (Amazon Simple Storage Service) #Consulting #Kubernetes #Monitoring #Scala #AWS (Amazon Web Services) #REST API #Docker #GCP (Google Cloud Platform) #Debugging #Java #Cloud
Role description
Responsibilities
• Design, configure, and maintain Confluent Kafka clusters.
• Implement and manage HTTP source/sink connectors for real-time data flow.
• Troubleshoot and optimize Kafka connectors and integrations with REST APIs.
• Collaborate with teams to build scalable, event-driven architectures.
• Provide best practices and consulting support on the Confluent platform.
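To give a concrete flavor of the connector-management work above, here is a minimal Python sketch of registering an HTTP sink connector through the Kafka Connect REST API. The topic, endpoint, and Connect worker URL are illustrative placeholders, and the config keys follow the Confluent HTTP Sink Connector's options; treat this as a sketch under those assumptions, not a production-ready config.

```python
import json
import urllib.request

def http_sink_config(topic: str, endpoint: str) -> dict:
    """Build a minimal HTTP sink connector config.

    Config keys follow the Confluent HTTP Sink Connector; the
    topic and endpoint values here are illustrative placeholders.
    """
    return {
        "name": f"http-sink-{topic}",
        "config": {
            "connector.class": "io.confluent.connect.http.HttpSinkConnector",
            "topics": topic,
            "http.api.url": endpoint,
            "tasks.max": "1",
            # Send record values as schemaless JSON; adjust the
            # converters to match the data in your topic.
            "value.converter": "org.apache.kafka.connect.json.JsonConverter",
            "value.converter.schemas.enable": "false",
        },
    }

def register_connector(connect_url: str, cfg: dict) -> None:
    """POST the config to a Kafka Connect worker,
    e.g. connect_url = "http://localhost:8083"."""
    req = urllib.request.Request(
        f"{connect_url}/connectors",
        data=json.dumps(cfg).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())

# Build (but do not submit) a config for an example topic.
cfg = http_sink_config("orders", "https://example.com/ingest")
print(cfg["config"]["topics"])  # orders
```

In practice the same JSON body can be submitted with `curl -X POST` against the worker's `/connectors` endpoint; the builder function just keeps the config in one reviewable place.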
Required Skills
• Hands-on experience with Confluent Kafka ecosystem (Kafka Connect, Schema Registry, KSQL).
• Strong expertise in HTTP connectors (source/sink).
• Solid knowledge of distributed systems & event streaming.
• Experience in Java/Python for debugging and customization.
• Familiarity with cloud (AWS/Azure/GCP) and Docker/Kubernetes.
Nice to Have
• Experience with other connectors (JDBC, S3, Debezium).
• Knowledge of Confluent Control Center & monitoring tools.
• Exposure to microservices architecture.
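On the troubleshooting and monitoring side, connector health is typically checked against the Connect REST API's `/connectors/<name>/status` endpoint. The sketch below extracts failed task ids from such a response; the payload shape follows what Kafka Connect returns, but the sample values (names, worker ids) are purely illustrative.

```python
def failed_tasks(status: dict) -> list[int]:
    """Return the ids of tasks reported as FAILED in a
    /connectors/<name>/status response."""
    return [t["id"] for t in status.get("tasks", [])
            if t.get("state") == "FAILED"]

# Illustrative status payload in the shape Kafka Connect returns.
status = {
    "name": "http-sink-orders",
    "connector": {"state": "RUNNING", "worker_id": "10.0.0.1:8083"},
    "tasks": [
        {"id": 0, "state": "RUNNING", "worker_id": "10.0.0.1:8083"},
        {"id": 1, "state": "FAILED", "worker_id": "10.0.0.2:8083"},
    ],
}
print(failed_tasks(status))  # [1]
```

A failed task can then be restarted via the same REST API, which is usually the first step before digging into the task's stack trace.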