Kafka Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Kafka Developer on a 12+ month contract, located in Jersey City, Dallas, or Tampa. Key skills include Kafka, Java, Scala, Python, and cloud platforms. Experience with data pipelines and compliance is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 23, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Scala #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Data Integration #Data Pipeline #Java #JSON (JavaScript Object Notation) #Kafka (Apache Kafka) #Distributed Computing #Compliance #Security #Data Security #Monitoring #Ansible #Cloud #Python #Azure #Deployment
Role description

Role: Senior Kafka Developer

Location: Jersey City, Dallas, or Tampa (Day 1 Onsite)

Duration: 12+ months

Job Description:

Responsibilities:

   • Designing, implementing, and managing Kafka-based data pipelines and messaging solutions.

   • Configuring, deploying, and monitoring Kafka clusters to ensure high availability and scalability.

   • Collaborating with cross-functional teams to integrate Kafka into various applications.

   • Troubleshooting and resolving Kafka-related issues.

   • Monitoring and optimizing Kafka cluster performance.

   • Building and maintaining message configurations and flows.

   • Developing and maintaining Kafka-based data pipelines.

   • Ensuring data security compliance.

   • Automating installations and deployments using tools like Ansible.

   • Maintaining performance metrics and tuning clusters as needed.

Requirements:

   • Strong understanding of Kafka architecture.

   • Proficiency in Java, Scala, or Python.

   • Experience with Kafka Streams and KSQL.

   • Knowledge of cloud platforms like AWS, Azure, or GCP.

   • Familiarity with data formats like Avro and JSON.

   • Experience with Schema Registry.

   • Ability to write and deploy Kafka Connect applications.

   • Knowledge of distributed computing concepts.

In your application, highlight impact:

   • Describe how your Kafka solutions have improved system scalability, reduced latency, or enabled real-time data analytics.

   • Mention specific performance improvements achieved through cluster optimization.

   • Highlight successful data integration efforts between different systems.

   • Quantify the impact of your work on business goals, such as increased efficiency or faster decision-making.