

Kafka Developer/Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Kafka Developer/Architect on a contract-to-hire basis, offering a competitive pay rate. Key skills include Confluent Kafka, Terraform, Java/Python/Scala, and cloud platforms (AWS/Azure/GCP). Remote work is available; candidates must be able to work on Planet's W2.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
July 23, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Edina, MN
-
🧠 - Skills detailed
#Infrastructure as Code (IaC) #Automation #Security #Kafka (Apache Kafka) #Scala #Strategy #Deployment #Cloud #Python #AWS (Amazon Web Services) #Data Pipeline #Data Integrity #Apache Kafka #Terraform #Azure #Programming #Java #Data Processing #GCP (Google Cloud Platform)
Role description
NO C2C HELP IN ANY FORM
NO Sponsorship Available
This is a contract-to-hire role, so all candidates must be able to work on Planet's W2 without restriction.
Job Summary
We are seeking a highly skilled Senior Kafka Platform Engineer to play a pivotal role in scaling our secure, cloud-native Confluent Kafka platform. This position is central to our data infrastructure strategy, responsible for developing and managing high-performance, real-time data streaming pipelines that power mission-critical applications across the organization.
Key Responsibilities
• Design, develop, and maintain scalable, reliable data pipelines using Confluent Kafka.
• Manage and optimize our Kafka-based streaming infrastructure to ensure high availability and performance.
• Implement Infrastructure as Code (IaC) using Terraform to automate deployment and configuration processes.
• Collaborate with cross-functional teams to integrate Kafka solutions into broader data and application architectures.
• Monitor, troubleshoot, and resolve issues in the Kafka ecosystem, ensuring data integrity and system resilience.
• Contribute to the evolution of our cloud-native architecture with a focus on scalability, security, and automation.
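To illustrate the kind of work the pipeline responsibilities above describe, here is a minimal sketch of the consume-transform-produce pattern that underlies a Kafka streaming job. This is purely illustrative: plain Python lists stand in for Kafka topics, and the event fields are hypothetical; a production implementation would use a Kafka client library (e.g. confluent-kafka) against real brokers.

```python
# Illustrative consume-transform-produce loop. In-memory lists stand in for
# Kafka topics; a real pipeline would poll a consumer and call a producer.

def transform(event: dict) -> dict:
    """Example enrichment step: derive an integer cents field (hypothetical schema)."""
    return {**event, "amount_cents": round(event["amount"] * 100)}

def run_pipeline(source_topic: list, sink_topic: list) -> None:
    # Consume each event from the source, transform it, and "produce" to the sink.
    for event in source_topic:
        sink_topic.append(transform(event))

source = [{"order_id": 1, "amount": 9.99}, {"order_id": 2, "amount": 4.50}]
sink = []
run_pipeline(source, sink)
```

In a real Confluent Kafka deployment, the same transform step would sit between a consumer poll loop and a producer, with offset commits and error handling around it.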
Qualifications
• Proven experience designing and managing Confluent Kafka or Apache Kafka in a production environment.
• Strong proficiency in at least one programming language: Java, Python, or Scala.
• Hands-on experience with Terraform and other IaC tools for cloud infrastructure automation.
• Deep understanding of distributed systems, event-driven architecture, and real-time data processing.
• Experience with cloud platforms such as AWS, Azure, or GCP.
• Excellent problem-solving skills and the ability to work independently in a fast-paced environment.
#TECH
#REMOTE
#EST
#C2H
#contract to hire
#w2