

Kafka Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Kafka Developer in Charlotte, NC (Hybrid). It is a long-term contract with an unspecified pay rate. Candidates should have 8-10 years of software development experience and 4+ years of hands-on Kafka experience; BFSI domain expertise is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Security #Scrum #GCP (Google Cloud Platform) #Spark (Apache Spark) #Hadoop #Prometheus #AWS (Amazon Web Services) #Big Data #Compliance #Apache Kafka #Kubernetes #Splunk #Jenkins #Databases #Cloud #Monitoring #SQL (Structured Query Language) #Azure #Docker #NoSQL #Datadog #Data Pipeline #Scala #Kafka (Apache Kafka) #Java #Microservices #Agile #Spring Boot #GitHub
Role description
Job Title: Kafka Developer
Location: Charlotte, NC (Hybrid – 3 days onsite per week)
Duration: Long-Term Contract
Job Summary
Client is seeking a highly skilled Kafka Developer to join its Enterprise Data & Messaging team. This role will focus on designing, building, and maintaining real-time streaming applications leveraging Apache Kafka. The ideal candidate will have strong experience in Java, Spring Boot, microservices, and event-driven architecture, with a proven ability to deliver scalable, resilient, and high-performance data pipelines.
Key Responsibilities
• Design, develop, and maintain Kafka producers, consumers, streams, and connectors.
• Implement real-time data pipelines and event-driven microservices using Kafka and Spring Boot (a minimal sketch follows this list).
• Collaborate with architects and engineers to integrate Kafka solutions with core banking and enterprise applications.
• Manage and optimize Kafka clusters, ensuring scalability, high availability, and fault tolerance.
• Monitor and troubleshoot topics, partitions, consumer groups, and offsets to maintain system health.
• Apply best practices in security, compliance, and governance in BFSI environments.
• Work in Agile/Scrum teams, participate in sprint planning, and ensure CI/CD integration with GitHub, Jenkins, and Harness.
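To make the producer/consumer responsibilities above concrete, here is a minimal Spring Boot sketch using spring-kafka. The topic name, consumer group, and String payloads are illustrative assumptions, not details taken from the role.

```java
// Minimal spring-kafka sketch of a producer/consumer pair in one service.
// Topic name, group id, and String payloads are assumptions for illustration.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class PaymentEventService {

    private static final String TOPIC = "payments.events"; // hypothetical topic

    private final KafkaTemplate<String, String> kafkaTemplate;

    public PaymentEventService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Producer side: key by account id so events for one account stay ordered per partition.
    public void publish(String accountId, String eventJson) {
        kafkaTemplate.send(TOPIC, accountId, eventJson);
    }

    // Consumer side: each microservice uses its own consumer group.
    @KafkaListener(topics = TOPIC, groupId = "payment-audit-service")
    public void onEvent(String eventJson) {
        // Processing should be idempotent, since redelivery is possible on rebalance.
        System.out.println("Received event: " + eventJson);
    }
}
```

With Spring Boot auto-configuration, the broker address and serializers would come from application properties (e.g. spring.kafka.bootstrap-servers); error handling, retries, and schema-registry serdes would be layered on top in a real service.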
Required Skills
• 8–10 years of software development experience, with a strong focus on Java (Spring Boot, Microservices).
• 4+ years of hands-on Kafka experience (Kafka Streams, Connect, Schema Registry, Confluent or MSK); see the Streams sketch after this list.
• Strong expertise in event-driven architecture and messaging patterns.
• Solid knowledge of SQL/NoSQL databases and integration with streaming pipelines.
• Experience deploying on cloud platforms (AWS, Azure, or GCP) and container orchestration (Docker, Kubernetes).
• Familiarity with CI/CD tools (GitHub, Jenkins, Harness, UrbanCode).
• Excellent problem-solving and communication skills.
• Banking/Financial Services domain experience preferred.
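As a rough illustration of the Kafka Streams experience asked for above, the sketch below builds a small topology that filters one topic into another. The topic names, application id, broker address, and filter rule are assumptions.

```java
// Minimal Kafka Streams topology: read raw transactions, keep flagged ones, write onward.
// Topic names, application id, broker address, and the filter predicate are illustrative assumptions.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class TransactionFilterTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "txn-filter-app");     // also serves as the consumer group id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed local broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> transactions = builder.stream("transactions.raw");
        transactions
            .filter((accountId, payload) -> payload.contains("\"flagged\":true")) // placeholder rule
            .to("transactions.flagged");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```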
Nice to Have
• Exposure to Spark, Hadoop, or other big data ecosystems.
• Experience with monitoring tools (Datadog, Splunk, Prometheus).
• Kafka certification (Confluent or equivalent) is a plus.
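For the consumer-group and offset monitoring mentioned in the responsibilities, a basic health check can be scripted against Kafka's AdminClient, as in the sketch below; the broker address and group id are assumptions.

```java
// Minimal AdminClient sketch: print the committed offsets of one consumer group.
// Broker address and group id are illustrative assumptions.
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerGroupOffsetCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker

        try (AdminClient admin = AdminClient.create(props)) {
            Map<TopicPartition, OffsetAndMetadata> offsets =
                admin.listConsumerGroupOffsets("payment-audit-service") // hypothetical group
                     .partitionsToOffsetAndMetadata()
                     .get();
            // Comparing these against log-end offsets gives consumer lag per partition.
            offsets.forEach((tp, om) ->
                System.out.printf("%s-%d committed offset: %d%n", tp.topic(), tp.partition(), om.offset()));
        }
    }
}
```

In practice these numbers would typically be collected by exporters feeding Datadog, Splunk, or Prometheus rather than ad-hoc code.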