

Kafka Developer
Role: Senior Kafka Developer
Location: Jersey City, Dallas, or Tampa (Day 1 Onsite)
Duration: 12+ months
Job Description:
Responsibilities:
• Designing, implementing, and managing Kafka-based data pipelines and messaging solutions.
• Configuring, deploying, and monitoring Kafka clusters to ensure high availability and scalability.
• Collaborating with cross-functional teams to integrate Kafka into various applications.
• Troubleshooting and resolving Kafka-related issues.
• Monitoring and optimizing Kafka cluster performance.
• Building and maintaining messaging configurations and data flows.
• Developing and maintaining Kafka-based data pipelines.
• Ensuring data security compliance.
• Automating installations and deployments using tools like Ansible.
• Maintaining performance metrics and tuning clusters as needed.
Required Skills:
• Strong understanding of Kafka architecture.
• Proficiency in Java, Scala, or Python.
• Experience with Kafka Streams and KSQL.
• Knowledge of cloud platforms like AWS, Azure, or GCP.
• Familiarity with data formats like Avro and JSON.
• Experience with Schema Registry.
• Ability to write and deploy Kafka Connect applications.
• Knowledge of distributed computing concepts.
When describing your experience:
• Describe how your Kafka solutions have improved system scalability, reduced latency, or enabled real-time data analytics.
• Mention specific performance improvements achieved through cluster optimization.
• Highlight successful data integration efforts between different systems.
• Quantify the impact of your work on business goals, such as increased efficiency or faster decision-making.