

Global IT Family LLC
Kafka Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Kafka Developer in Whippany, NJ, on a W2 contract. Required skills include 8+ years in Java and Apache Kafka, data pipeline development, microservices, and database experience. Familiarity with big data technologies is preferred.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date
December 4, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Whippany, NJ 07981
🧠 - Skills detailed
#Oracle #Scala #Zookeeper #Data Pipeline #Hadoop #Databases #Kafka (Apache Kafka) #JPA (Java Persistence API) #Microservices #Snowflake #Spark (Apache Spark) #Spring Boot #PostgreSQL #Security #Compliance #Java #Big Data #Apache Kafka
Role description
Role: Kafka Developer
Location: Whippany, NJ (Onsite)
Type: W2
Job Description:
8+ years of experience designing, developing, and maintaining real-time data streaming applications using Java and Apache Kafka.
Data pipeline development: Design, build, and maintain real-time data streaming pipelines using Apache Kafka and Java.
Kafka application development: Develop and implement Kafka producer and consumer applications, including microservices.
Integration: Integrate Kafka with various existing systems, databases, and data sources.
Performance optimization: Monitor and troubleshoot Kafka cluster performance, topics, and brokers for efficiency and scalability.
Maintenance and support: Support and upgrade existing Kafka implementations and troubleshoot issues.
Security and compliance: Implement security measures to protect data streams and meet compliance standards.
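As a rough illustration of the configuration side of this work, a minimal Kafka producer setup might look like the sketch below. The broker address is a placeholder, and the specific durability choices (acks, idempotence) are illustrative assumptions, not requirements from this posting:

```
# Placeholder broker address -- replace with the actual cluster endpoint
bootstrap.servers=localhost:9092
# Serializers for record keys and values
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
# Durability: wait for all in-sync replicas to acknowledge each write
acks=all
# Avoid duplicate records on producer retries
enable.idempotence=true
```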
Required skills and experience:
Core Java: Strong proficiency in Java, including concepts like multi-threading, concurrency, and collections.
Apache Kafka: Deep knowledge of Kafka architecture, features, and components like Kafka Streams, ZooKeeper, and Schema Registry.
Messaging and streaming: Experience with messaging and stream processing on Kafka.
Microservices: Experience with microservices architecture, as many Kafka applications are built as microservices.
Databases: Experience with databases and data warehousing technologies (e.g., PostgreSQL, Oracle).
Frameworks: Experience with frameworks such as Spring (Spring Boot, Spring Data JPA) and ORM frameworks.
Big data technologies: Familiarity with big data technologies like Spark, Hadoop, or Snowflake.
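The Core Java items above (multi-threading, concurrency, collections) can be illustrated with a small stdlib-only sketch: a hypothetical in-process producer/consumer pair built on `BlockingQueue`. This is plain Java for illustration, not the Kafka client API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueDemo {
    // Runs one producer thread and one consumer thread over a bounded queue
    // and returns the events in the order the consumer received them.
    public static List<String> run() throws InterruptedException {
        // Bounded queue: put() blocks when full, take() blocks when empty.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);
        List<String> received = new ArrayList<>();

        Thread producer = new Thread(() -> {
            for (int i = 0; i < 5; i++) {
                try {
                    queue.put("event-" + i); // blocks if the queue is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        Thread consumer = new Thread(() -> {
            for (int i = 0; i < 5; i++) {
                try {
                    received.add(queue.take()); // blocks until an element arrives
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // [event-0, event-1, event-2, event-3, event-4]
    }
}
```

The same blocking hand-off pattern, scaled out across processes and partitions, is essentially what Kafka producer and consumer applications implement.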
Job Type: Contract
Application Question(s):
Are you comfortable working on a W2 basis?
LinkedIn URL:
Visa Status:
How many years of experience do you have with Apache Kafka? (This must be reflected in your resume.)
Work Location: In person






