Delta System & Software, Inc.

Kafka Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Kafka Developer in New York City, NY, with a contract length of unspecified duration. The pay rate is also unspecified. The role requires 5+ years of Kafka administration experience, proficiency in Python/Bash, and strong Linux skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 19, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
New York, NY
-
🧠 - Skills detailed
#Prometheus #Observability #Automation #Computer Science #Terraform #Kafka (Apache Kafka) #Kerberos #Compliance #Bash #Data Governance #Data Engineering #Cloud #DevOps #Monitoring #Deployment #Grafana #Linux #Replication #Data Pipeline #Infrastructure as Code (IaC) #AWS (Amazon Web Services) #Scripting #GCP (Google Cloud Platform) #Azure #Security #Python
Role description
Role: Kafka Administrator (Confluent Platform)
Location: New York City, NY 10004; day-one onsite (3 days on-site per week); non-local candidates will work EST/CST hours.

Job Responsibilities
• Deploy, configure, and maintain Confluent Kafka clusters in a highly available production environment.
• Execute Kafka upgrades and patching with minimal downtime and smooth transitions.
• Administer security protocols and access controls to protect the platform and data.
• Monitor cluster health and performance using the organization's monitoring stack (Prometheus, Grafana) and respond to anomalies.
• Troubleshoot and resolve cluster-related incidents (e.g., broker outages, replication lag, partition reassignment, data loss risks).
• Develop and enhance automation scripts to streamline operational tasks and reduce manual intervention.
• Assist with capacity planning and resource management to support future growth (topics, partitions, quotas, and hardware requirements).
• Collaborate with SRE/DevOps, data engineers, and application teams to optimize data pipelines and performance.

Qualifications
• Bachelor's degree in Computer Science, Information Technology, or an equivalent combination of education and experience.
• 5+ years of Kafka administration experience, preferably with Confluent Platform in production environments.
• Proven expertise in Kafka upgrades, production troubleshooting, and incident response.
• Strong background in Linux system administration.
• Proficiency in Python and/or Bash scripting for automation and tooling.
• Solid understanding of Prometheus and Grafana for monitoring and observability; experience creating dashboards and alerts.
• Familiarity with security best practices for Kafka (ACLs, Kerberos/SASL, TLS) is a plus.
• Confluent Kafka certification is advantageous.

Nice-to-have
• Experience with Kafka Connect, KSQL/ksqlDB, and Schema Registry.
• Familiarity with cloud deployments (AWS, GCP, or Azure) and infrastructure as code (e.g., Terraform).
• Knowledge of data governance and compliance considerations related to streaming platforms.
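As a flavor of the automation scripting the responsibilities call for, here is a minimal Python sketch that parses the per-partition lines printed by the standard `kafka-topics --describe` command and flags under-replicated partitions (where the in-sync replica set, Isr, is smaller than the full replica set). The sample topic name and output are illustrative assumptions, not part of the posting.

```python
import re

def under_replicated(describe_output: str):
    """Return (topic, partition) pairs whose ISR is smaller than the
    replica set, i.e. partitions that are under-replicated."""
    flagged = []
    for line in describe_output.splitlines():
        # Partition lines look like:
        # Topic: <name>  Partition: <n>  Leader: <id>  Replicas: 1,2,3  Isr: 1,2
        m = re.search(
            r"Topic:\s*(\S+)\s+Partition:\s*(\d+).*?"
            r"Replicas:\s*([\d,]+)\s+Isr:\s*([\d,]+)",
            line,
        )
        if not m:
            continue
        topic, partition, replicas, isr = m.groups()
        if len(isr.split(",")) < len(replicas.split(",")):
            flagged.append((topic, int(partition)))
    return flagged

# Hypothetical sample output for demonstration
sample = (
    "Topic: orders\tPartition: 0\tLeader: 1\tReplicas: 1,2,3\tIsr: 1,2,3\n"
    "Topic: orders\tPartition: 1\tLeader: 2\tReplicas: 2,3,1\tIsr: 2\n"
)
print(under_replicated(sample))  # [('orders', 1)]
```

In practice the broker's own `--under-replicated-partitions` flag reports the same condition; a parsing helper like this is useful when feeding the result into alerting or remediation tooling.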