

Arjava Technologies
Confluent Kafka Engineer (In Person and US - GC /Citizenship Required )
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Confluent Kafka Engineer in Woodlawn, MD, offering $85.00 - $90.00 per hour for a contract lasting over 6 months. Requires 10+ years in software development, expertise in Confluent Kafka, and proficiency in Java or Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
February 11, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Woodlawn, MD 21207
-
🧠 - Skills detailed
#Python #Big Data #Scrum #Programming #Logging #Apache Kafka #Zookeeper #GIT #API (Application Programming Interface) #Computer Science #Agile #Automation #DevOps #Data Engineering #Scala #Security #Hadoop #ETL (Extract, Transform, Load) #Monitoring #Prometheus #Kubernetes #Microservices #Jenkins #AWS (Amazon Web Services) #Data Integrity #Data Pipeline #Compliance #Kafka (Apache Kafka) #Java #Grafana
Role description
Key Required Skills:
This role centers on the development, testing, and production support of Confluent Kafka-based systems. It requires deep expertise in Kafka architecture, including Confluent Control Center, Kafka Streams, and Kafka Connect. The engineer will collaborate closely with cross-functional teams to ensure the smooth operation of data streaming services.
Position Description:
• Design Confluent Kafka cluster environments, configure and manage Kafka instances, and monitor system performance (see the topic-administration sketch after this list).
• Ensure data integrity and availability in a big data environment.
• Apply expertise in a programming language such as Java or Python.
• Collaborate with product design teams and SMEs to understand data pipeline needs.
• Participate in all Agile ceremonies.
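As a rough illustration of the "configure and manage Kafka instances" duty above, here is a minimal sketch of topic administration using Kafka's Java AdminClient. The broker address, topic name, and partition/replication counts are placeholder assumptions, not values from this posting.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicAdmin {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder broker address; real clusters use a bootstrap server list.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 6 partitions and replication factor 3 (illustrative values).
            NewTopic topic = new NewTopic("orders", 6, (short) 3);
            admin.createTopics(List.of(topic)).all().get();

            // Basic cluster health check: list the broker nodes.
            admin.describeCluster().nodes().get()
                 .forEach(node -> System.out.println("Broker: " + node));
        }
    }
}
```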
Skills Requirements:
• 10+ years of experience in software development.
• Software development experience with a solid understanding of building, deploying, and maintaining applications that leverage the Confluent Kafka platform, focusing on data streaming and messaging solutions.
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• A Master's or Doctorate degree may substitute for required experience.
• 5+ years of experience on an Agile development team.
• Must be able to obtain and maintain a Public Trust clearance (contract requirement).
• Must be able to work on-site in Woodlawn, MD 5 days a week.
Required Skills
• Extensive experience with Apache Kafka and Confluent Kafka, including proficiency with Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.
• Proven experience in Kafka development, including the Producer and Consumer APIs, stream processing, and connector development.
• Experience with Kafka cluster management, including setup, configuration, monitoring, and troubleshooting.
• Familiarity with distributed systems, microservices architecture, and event-driven design patterns.
• Experience with AWS and containerization (Kubernetes) is a plus.
• Proficiency in a programming language such as Java.
• Strong knowledge of Kafka internals, including brokers, ZooKeeper, topics, partitions, and offsets.
• Experience with monitoring tools (e.g., Prometheus, Grafana) and logging frameworks (e.g., Log4j).
• Proficiency in using Confluent Control Center for monitoring, managing, and optimizing Kafka clusters.
• Expertise in Kafka Streams for building scalable, fault-tolerant stream processing applications.
• Experience with ksqlDB for real-time processing and analytics on Kafka topics.
• Strong understanding of Kafka Connect for integrating Kafka with external data sources and sinks.
• Understanding of networking, security, and compliance aspects related to Kafka.
• Familiarity with CI/CD pipelines and automation tools (e.g., Jenkins, Git).
• Write and maintain high-quality code for Kafka producers, consumers, and stream processing applications (a minimal producer sketch follows this list).
• Develop and manage Kafka connectors for seamless integration with external systems, ensuring data consistency and reliability (see the connector registration sketch below).
• Utilize Kafka Streams for real-time processing of streaming data, transforming and enriching data as it flows through the pipeline (see the Streams topology sketch below).
• Employ ksqlDB for stream processing tasks, including real-time analytics and transformations (see the ksqlDB sketch below).
• Collaborate with data engineers, software developers, and DevOps teams to integrate Kafka solutions with existing systems.
• Ensure all Kafka-based solutions are scalable, secure, and optimized for performance.
• Troubleshoot and resolve issues related to Kafka performance, latency, and data integrity, including issues specific to Kafka Streams, ksqlDB, and Kafka Connect.
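To make the producer expectation concrete, below is a minimal Java producer sketch. The bootstrap address, topic, and payload are hypothetical; acks=all and idempotence are shown because the posting stresses data integrity.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Durability settings typical for data-integrity-sensitive pipelines.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-42", "{\"status\":\"CREATED\"}");
            // Asynchronous send; the callback surfaces broker acks or failures.
            producer.send(record, (metadata, e) -> {
                if (e != null) {
                    e.printStackTrace();
                } else {
                    System.out.printf("Wrote to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any in-flight records
    }
}
```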
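Connector work is typically done by registering configuration with the Kafka Connect REST API (port 8083 by default). This sketch posts a hypothetical JDBC sink configuration using Java's built-in HttpClient; the connector class, topic, and connection URL are illustrative only and depend on the deployment.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical sink connector config; class and connection details vary by environment.
        String config = """
            {
              "name": "orders-jdbc-sink",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
                "topics": "orders",
                "connection.url": "jdbc:postgresql://db:5432/orders",
                "tasks.max": "2"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8083/connectors")) // Connect REST default port
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(config))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```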
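For the Kafka Streams responsibilities, a minimal topology might look like the following. The application id, topic names, and the toUpperCase "enrichment" are stand-ins for real business logic.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderEnricher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-enricher"); // also the consumer group id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");
        orders
            .filter((key, value) -> value != null && !value.isBlank()) // drop tombstones/empties
            .mapValues(value -> value.toUpperCase())                   // stand-in for real enrichment
            .to("orders-enriched");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // clean shutdown
        streams.start();
    }
}
```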
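And for ksqlDB, statements can be issued from Java via the ksqlDB client library. The host, port, stream definition, and aggregate below are assumptions for illustration; exact SQL syntax varies slightly across ksqlDB versions.

```java
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;

public class KsqlSetup {
    public static void main(String[] args) throws Exception {
        // Host and port are placeholders for a reachable ksqlDB server.
        ClientOptions options = ClientOptions.create()
            .setHost("localhost")
            .setPort(8088);

        Client client = Client.create(options);
        // Declare a stream over an existing topic, then a materialized rollup.
        client.executeStatement(
            "CREATE STREAM orders_stream (id VARCHAR KEY, status VARCHAR) " +
            "WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');").get();
        client.executeStatement(
            "CREATE TABLE orders_by_status AS " +
            "SELECT status, COUNT(*) AS cnt FROM orders_stream GROUP BY status EMIT CHANGES;").get();
        client.close();
    }
}
```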
Desired Skills
• Experience in an AWS environment.
• Experience with Hadoop or other big data platform.
• Excellent troubleshooting and analytical skills to quickly identify and resolve issues.
• Proficiency in Software development, preferably Java.
• Experience working on Agile projects and understanding Agile terminology.
• Participate in daily scrum and provide updates.
Job Types: Full-time, Contract
Pay: $85.00 - $90.00 per hour
Work Location: In person