KAnand Corporation

Confluent Kafka Connect with HTTP Source/Sink Connectors

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Confluent Kafka Connect specialist with 10+ years of experience. It is a 12-month remote contract, paying competitively. Key skills include Kafka, middleware, HTTP connectors, Java/Python, and cloud familiarity (AWS/Azure/GCP).
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 7, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Azure #Monitoring #REST (Representational State Transfer) #Cloud #Debugging #Java #Scala #Microservices #GCP (Google Cloud Platform) #Python #Kafka (Apache Kafka) #Docker #REST API #JDBC (Java Database Connectivity) #Consulting #Kubernetes
Role description
Position: Confluent Kafka Connect with HTTP Source/Sink Connectors
Location: USA - Remote
Duration: 12-month contract
Experience: 10+ years

Must Have Skills:
• Kafka
• Middleware
• Confluent Kafka

Required Skills:
• Hands-on experience with the Confluent Kafka ecosystem (Kafka Connect, Schema Registry, KSQL).
• Strong expertise in HTTP connectors (source/sink).
• Solid knowledge of distributed systems and event streaming.
• Experience in Java/Python for debugging and customization.
• Familiarity with cloud platforms (AWS/Azure/GCP) and Docker/Kubernetes.

Nice to Have:
• Experience with other connectors (JDBC, S3, Debezium).
• Knowledge of Confluent Control Center and monitoring tools.
• Exposure to microservices architecture.

Responsibilities:
• Design, configure, and maintain Confluent Kafka clusters.
• Implement and manage HTTP source/sink connectors for real-time data flow.
• Troubleshoot and optimize Kafka connectors and integrations with REST APIs.
• Collaborate with teams to build scalable, event-driven architectures.
• Provide best practices and consulting support on the Confluent platform.
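To illustrate the kind of work the responsibilities describe, here is a minimal sketch of a Confluent HTTP Sink connector configuration. The connector name, topic, endpoint URL, and broker addresses are placeholder assumptions, not details from this posting:

```json
{
  "name": "http-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "http.api.url": "https://example.com/api/events",
    "request.method": "POST",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "reporter.bootstrap.servers": "localhost:9092",
    "reporter.result.topic.name": "success-responses",
    "reporter.error.topic.name": "error-responses"
  }
}
```

A configuration like this would typically be submitted to a running Kafka Connect worker through its REST API, e.g. `curl -X POST -H "Content-Type: application/json" --data @http-sink.json http://localhost:8083/connectors`.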