
Kafka Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Kafka Developer on a contract lasting more than 6 months, offering $60.00 - $70.00 per hour. Key skills required include 5–8 years of experience with Apache Kafka, KStream, KSQL, and cloud-native deployments (AWS/GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date discovered
September 20, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Quincy, MA 02269
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #IoT (Internet of Things) #Deployment #Cloud #ETL (Extract, Transform, Load) #JDBC (Java Database Connectivity) #Dynatrace #Elasticsearch #Replication #Java #S3 (Amazon Simple Storage Service) #Scala #Python #Ansible #GCP (Google Cloud Platform) #Prometheus #AWS (Amazon Web Services) #Disaster Recovery #Data Engineering #JSON (JavaScript Object Notation) #Terraform #Kubernetes #REST (Representational State Transfer) #Grafana #Apache Kafka #Libraries #Security #Microservices #Data Pipeline
Role description
Confluent Kafka Developer / Streaming Data Engineer
• Experience Level: 5–8 Years
• Job Summary: We are looking for a highly skilled Confluent Kafka Developer / Streaming Engineer with 5–8 years of experience in building and maintaining real-time streaming platforms using Confluent Kafka. This role involves working with Kafka Streams (KStream), KSQL/ksqlDB, and Kafka Connect, and managing Confluent-based infrastructure to enable scalable, secure, and resilient event-driven systems.
Key Responsibilities:
Kafka Stream Processing
• Design and develop real-time applications using Kafka Streams (KStream) for stateless and stateful transformations, windowing, and joins.
• Create KSQL/ksqlDB streams and tables for on-the-fly analytics and streaming ETL use cases.
• Optimize streaming pipelines for throughput, latency, and exactly-once processing guarantees (a sketch of such a topology follows this list).
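For illustration only, here is a minimal Kafka Streams sketch of the kind of application described above: a stateful, windowed count over an input stream with exactly-once processing enabled. The topic names, application id, and broker address are placeholders, not details taken from this role.

    import java.time.Duration;
    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.utils.Bytes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.Produced;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.state.WindowStore;

    public class OrderCountApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-count-app");   // placeholder application id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");    // placeholder broker address
            props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG,
                      StreamsConfig.EXACTLY_ONCE_V2);                            // exactly-once processing

            StreamsBuilder builder = new StreamsBuilder();

            // Stateless step: read raw order events keyed by customer id.
            KStream<String, String> orders =
                    builder.stream("orders", Consumed.with(Serdes.String(), Serdes.String()));

            // Stateful step: count orders per customer in 5-minute tumbling windows.
            orders.groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                  .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
                  .count(Materialized.<String, Long, WindowStore<Bytes, byte[]>>as("orders-per-customer"))
                  .toStream((windowedKey, count) ->
                          windowedKey.key() + "@" + windowedKey.window().startTime()) // flatten window into the key
                  .to("orders-per-customer-5m", Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }

The same aggregation could be expressed declaratively in ksqlDB with a CREATE TABLE ... WINDOW TUMBLING statement, which is where the KSQL/ksqlDB responsibility above comes in.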
Confluent Platform Integration
• Implement Kafka Connect for source/sink connectors (e.g., JDBC, S3, Elasticsearch); a sketch of registering a JDBC source connector follows this list.
• Manage Schema Registry for Avro/JSON/Protobuf schema evolution with full compatibility controls.
• Utilize Confluent Control Center for visibility into throughput, lag, and health of data pipelines.
• Provision and maintain Confluent Kafka clusters on Kubernetes, OpenShift, or AWS/GCP using Helm, Terraform, or Ansible.
• Configure multi-region replication, disaster recovery, and MirrorMaker 2.0.
• Monitor and troubleshoot clusters using Prometheus, Grafana, Confluent Metrics Reporter, or third-party tools (e.g., Dynatrace, Alma).
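As a rough sketch of the Kafka Connect work, the snippet below registers a JDBC source connector by POSTing its configuration to a Connect worker's REST API (default port 8083). The connector class and config keys are the standard Confluent JDBC source connector settings; the database URL, credentials, table name, topic prefix, and worker host are hypothetical placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RegisterJdbcSource {
        public static void main(String[] args) throws Exception {
            // Connector definition: stream the "orders" table into the Kafka topic "db-orders".
            String connectorJson = """
                {
                  "name": "orders-jdbc-source",
                  "config": {
                    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                    "connection.url": "jdbc:postgresql://db-host:5432/orders",
                    "connection.user": "connect_user",
                    "connection.password": "change-me",
                    "mode": "incrementing",
                    "incrementing.column.name": "id",
                    "table.whitelist": "orders",
                    "topic.prefix": "db-"
                  }
                }
                """;

            // POST the definition to the Connect worker's REST API.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://connect-worker:8083/connectors"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(connectorJson))
                    .build();

            HttpResponse<String> response =
                    HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());

            System.out.println(response.statusCode() + " " + response.body());
        }
    }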
Required Skills & Experience:
• 5–8 years of experience with Apache Kafka, including 2–4 years on the Confluent Kafka Platform.
• Strong hands-on experience with KStream, KSQL, and Kafka Connect.
• Proficiency in Java, Python, or Scala for streaming app development.
• Familiarity with distributed systems, schema evolution, data consistency, and idempotency (see the producer sketch after this list).
• Experience with cloud-native Kafka deployments (AWS MSK, Confluent Cloud, or Kubernetes).
• Strong knowledge of Kafka security, topic design, and capacity planning.
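On the data-consistency and idempotency point, the sketch below configures a producer for idempotent, transactional writes, so broker-side retries never duplicate a record and the two sends commit atomically or not at all. The broker address, topic names, and transactional id are illustrative, not part of this posting.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class IdempotentProducerExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");      // placeholder broker
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);              // dedupe retried sends
            props.put(ProducerConfig.ACKS_CONFIG, "all");                           // required for idempotence
            props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "orders-writer-1");   // enables atomic multi-topic writes

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.initTransactions();
                producer.beginTransaction();
                try {
                    producer.send(new ProducerRecord<>("orders", "cust-42", "{\"amount\": 19.99}"));
                    producer.send(new ProducerRecord<>("audit", "cust-42", "order received"));
                    producer.commitTransaction();                                    // both records or neither
                } catch (RuntimeException e) {
                    producer.abortTransaction();
                    throw e;
                }
            }
        }
    }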
Preferred Qualifications:
• Confluent Kafka certification (Developer/Admin) is a plus.
• Experience in event-driven microservices, IoT, or real-time analytics.
• Familiarity with Confluent REST Proxy, Kafka Streams testing libraries, and ksqlDB UDF/UDAF (a testing sketch follows this list).
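To illustrate the Kafka Streams testing libraries mentioned above, here is a small JUnit 5 test that runs a topology in-process with TopologyTestDriver from kafka-streams-test-utils, so no broker is needed. The topology and topic names are made up for the example.

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.TestInputTopic;
    import org.apache.kafka.streams.TestOutputTopic;
    import org.apache.kafka.streams.TopologyTestDriver;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Produced;
    import org.junit.jupiter.api.Test;

    import static org.junit.jupiter.api.Assertions.assertEquals;

    class UppercaseTopologyTest {

        @Test
        void uppercasesEveryValue() {
            // A tiny topology under test: uppercase each order event.
            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("orders-in", Consumed.with(Serdes.String(), Serdes.String()))
                   .mapValues(v -> v.toUpperCase())
                   .to("orders-out", Produced.with(Serdes.String(), Serdes.String()));

            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-test");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");  // never contacted by the test driver

            // TopologyTestDriver executes the topology synchronously, record by record.
            try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
                TestInputTopic<String, String> in =
                        driver.createInputTopic("orders-in", new StringSerializer(), new StringSerializer());
                TestOutputTopic<String, String> out =
                        driver.createOutputTopic("orders-out", new StringDeserializer(), new StringDeserializer());

                in.pipeInput("cust-42", "order placed");
                assertEquals("ORDER PLACED", out.readValue());
            }
        }
    }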
Job Type: Contract
Pay: $60.00 - $70.00 per hour
Work Location: In person