

Senior Kafka Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Kafka Developer in Phoenix, Arizona, for 12 months at a competitive pay rate. It requires 3-5 years of Kafka experience, proficiency in Java or Python, and knowledge of Avro/Protobuf schema management.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 11, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Phoenix, AZ
Skills detailed: #Debugging #Data Quality #AWS (Amazon Web Services) #Data Pipeline #GCP (Google Cloud Platform) #Python #Scala #Cloud #Monitoring #Kafka (Apache Kafka) #Prometheus #Azure #Grafana #Java #Apache Kafka
Role description
Job Title: Senior Kafka Developer
Location: Phoenix, Arizona (Hybrid)
Duration: 12 Months
Job Description:
We are looking for an experienced Kafka Developer with 3-5 years of hands-on experience building and maintaining real-time data streaming pipelines using Apache Kafka.
Key Responsibilities:
• Design, develop, and maintain scalable and reliable Kafka-based data pipelines.
• Work with Kafka producer/consumer APIs, Kafka Streams, and Kafka Connect to integrate with various data sources and sinks.
• Implement schema management using Avro or Protobuf.
• Ensure data quality, integrity, and efficient processing in real-time environments.
• Collaborate with cross-functional teams to understand business requirements and deliver technical solutions.
• Monitor and debug Kafka clusters to ensure high performance and reliability.
• Perform performance tuning and optimization of data pipelines.
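As a rough illustration of the producer-side work described above, here is a minimal Python sketch of a producer configuration and message serialization. The broker address, topic, and record fields are hypothetical (not taken from this posting), and no live connection is made; with a client such as confluent-kafka, the config dict and payload would be handed to a Producer.

```python
import json

# Hypothetical producer configuration -- broker address and settings are
# assumptions for illustration, not requirements from the role description.
producer_config = {
    "bootstrap.servers": "broker1:9092",
    "acks": "all",               # wait for acknowledgement from all in-sync replicas
    "enable.idempotence": True,  # avoid duplicate records on producer retries
}

# A record destined for a hypothetical "orders" topic, serialized to bytes
# the way a producer client would before sending it to the broker.
record = {"order_id": 42, "status": "shipped"}
payload = json.dumps(record).encode("utf-8")

print(payload)  # b'{"order_id": 42, "status": "shipped"}'
```

In a real pipeline the serialization step would typically use an Avro or Protobuf serializer tied to a schema registry rather than plain JSON.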
Required Skills:
• 3-5 years of hands-on experience with Apache Kafka
• Proficiency in Java or Python
• Experience with:
  • Kafka Streams
  • Kafka Connect
  • Kafka Producer and Consumer APIs
• Strong knowledge of Avro/Protobuf schema management
• Solid understanding of Kafka internals and architecture
• Experience integrating Kafka with other data systems
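The Avro/Protobuf schema-management requirement above centers on record definitions like the following minimal Avro sketch. The record and field names are illustrative only, and schema-registry integration is omitted.

```python
import json

# Minimal Avro record schema -- names are hypothetical, for illustration.
order_event_schema = {
    "type": "record",
    "name": "OrderEvent",
    "namespace": "com.example.orders",
    "fields": [
        {"name": "order_id", "type": "long"},
        {"name": "status", "type": "string"},
        # An optional field with a default: adding fields this way keeps the
        # schema backward compatible for consumers on the older version.
        {"name": "amount", "type": ["null", "double"], "default": None},
    ],
}

# Schemas are registered and exchanged as JSON text.
schema_json = json.dumps(order_event_schema)
```

Schema evolution discipline (e.g., only adding fields that carry defaults) is what lets producers and consumers upgrade independently.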
Preferred Skills (Nice to Have):
• Exposure to cloud platforms such as AWS, Google Cloud Platform, or Azure
• Familiarity with monitoring tools:
  • Prometheus
  • Grafana
  • Confluent Control Center
• Strong debugging and performance-tuning skills
• Excellent problem-solving and communication abilities