Sr Kafka SME (W2 Candidates Only)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Kafka SME in Washington, DC, offering a 12-month, fully on-site contract at an undisclosed pay rate. It requires extensive hands-on Kafka experience, design documentation skills, and knowledge of hybrid multi-cloud environments. W2 candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 19, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Washington, DC
-
🧠 - Skills detailed
#Apache Kafka #AI (Artificial Intelligence) #Kubernetes #JSON (JavaScript Object Notation) #BigQuery #NoSQL #BI (Business Intelligence) #Azure #Microsoft Power BI #API (Application Programming Interface) #Compliance #Databases #HADR (High Availability Disaster Recovery) #Replication #JDBC (Java Database Connectivity) #Deployment #Scala #Cloud #Kafka (Apache Kafka) #SQL (Structured Query Language) #Strategy #Disaster Recovery
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Nasscomm, Inc., is seeking the following. Apply via Dice today!

Role: Kafka Subject Matter Expert
Location: Washington, DC (100% on-site)
Duration: 12-month contract

Required Skills:
The chosen resource must demonstrate these capabilities through actual work experience, not merely training:
• Hands-on experience designing and implementing Kafka event-streaming capabilities in applications and infrastructure across hybrid multi-cloud environments.
• Experience producing IT technical artifacts, with an emphasis on Kafka event streams, including design documents, architecture diagrams, architecture assessments, white papers, test plans, requirements mapping, and implementation plans.
• In-depth knowledge of the design principles and inner workings of Kafka implementations, and of applicable use cases for migrating applications from legacy styles to modern styles using event streams and async APIs.
• Demonstrable knowledge of Apache Kafka, Confluent Platform, and Confluent Cloud.
• Demonstrable knowledge of Strimzi and Confluent for Kubernetes.
• Demonstrable knowledge of cloud services with Kafka API compatibility (e.g., Azure Event Hubs, Amazon MSK).
• Demonstrable knowledge of Confluent Schema Registry and serialization using JSON and Avro schemas.
• Demonstrable knowledge of Kafka replication options, including MirrorMaker, Confluent Cluster Linking, and Schema Linking.
• Demonstrable knowledge of Confluent identity management and RBAC, integrated/federated with an enterprise IdP and role management.
• Demonstrable knowledge of the Kafka client APIs (Producer, Consumer, Streams).
• Demonstrable knowledge of Kafka Connect and connectors (e.g., JDBC, Cassandra, Google BigQuery, WebSphere MQ).
• Demonstrable knowledge of sizing and capacity planning for Kafka clusters.
• Demonstrable knowledge of Kafka topic partitioning strategies, including partition key design strategies.

Preferred Skills:
• Integrating Kafka event streams with agentic AI workflows.
• Using design patterns to build scalable and maintainable applications/solutions.
• Clearly documenting code, models, and technical solutions.
• Proficiency in generative AI and prompt engineering.
• Continuous learning and adaptability in a very large IT organization.
• Communicating complex technical concepts to both technical and executive stakeholders.
• Troubleshooting software and technical implementations in large-scale enterprise ecosystems.
• Understanding of concepts such as CI/CD, containerization, and deployment strategies for Kafka components in large-scale production environments.
• Querying and managing data in both SQL and NoSQL databases.
• Proficiency creating technical diagrams with products such as Microsoft Visio or draw.io.
• Proficiency creating technical design and architecture documents in Microsoft Word.
• Proficiency creating business and technical presentations in Microsoft PowerPoint.
• Proficiency creating data representations, charts, and reports in tools such as Microsoft Excel and Power BI.

Day-to-Day Responsibilities:
• Provide analysis support to integrate new Kafka cluster and event-stream requirements into established large-scale, production enterprise architectures.
• Conduct design and architecture assessments and provide written recommendations for integrating Kafka event streams within an application domain or enterprise event brokering.
• Troubleshoot production Kafka event streams and software integration issues as top-tier internal support, including reviewing performance issues and proposing resolutions.
• Conduct research and provide written reviews of Kafka ecosystem best practices and innovative strategies for hybrid multi-cloud high availability.
• Conduct proof-of-concept activities and build prototypes for Kafka technology-stack components in sandbox environments to assess new capabilities.
• Define and implement standards and patterns for the Kafka ecosystem life cycle, test-driven protection schemes, and automated implementation strategies.
• Devise strategies and roadmaps for enterprise expansion of Kafka ecosystem capabilities, integrations, and governance.
• Conduct strategy and compliance reviews and provide recommendations for enterprise Kafka ecosystem elements, which may include: architecture frameworks, databases, network, web and application architecture resources, backup and recovery, high availability, disaster recovery, patch management, and analytics.
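
To give candidates a concrete sense of the "partition key design strategies" requirement above, here is a minimal, illustrative sketch of the core idea. Kafka's Java client hashes the message key (with murmur2) modulo the partition count; `zlib.crc32` stands in for murmur2 here purely to show the principle, and the key names are made up:

```python
import zlib

def partition_for_key(key: bytes, num_partitions: int) -> int:
    """Deterministically map a message key to a partition.

    Illustrative stand-in: Kafka's Java default partitioner uses murmur2,
    not crc32, but the principle is the same -- the mapping is a pure
    function of the key, so every message with the same key lands on the
    same partition, preserving per-key ordering.
    """
    return zlib.crc32(key) % num_partitions

# Same key -> same partition, every time (per-key ordering guarantee).
p1 = partition_for_key(b"customer-42", 12)
p2 = partition_for_key(b"customer-42", 12)
assert p1 == p2 and 0 <= p1 < 12
```

This is why key design matters in practice: a low-cardinality or skewed key (e.g., one dominant customer ID) concentrates traffic on a few partitions, and changing the partition count remaps existing keys, both of which a Kafka SME is expected to plan around.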
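
The serialization requirement (JSON and Avro schemas with Confluent Schema Registry) can likewise be sketched in miniature. The following is not the Schema Registry client API; it is a hypothetical producer-side helper, with an invented `ORDER_SCHEMA` and field names, showing the general pattern of validating a record against a schema before serializing it to bytes:

```python
import json

# Hypothetical minimal schema for an order event (illustration only).
# Real deployments register a JSON Schema or Avro schema in Confluent
# Schema Registry and use its serializer classes instead.
ORDER_SCHEMA = {
    "order_id": str,
    "customer_id": str,
    "amount": float,
}

def serialize_order(record: dict) -> bytes:
    """Validate a record against ORDER_SCHEMA, then serialize to UTF-8 JSON bytes."""
    for field, expected_type in ORDER_SCHEMA.items():
        if field not in record:
            raise ValueError(f"missing field: {field}")
        if not isinstance(record[field], expected_type):
            raise TypeError(f"field {field} must be {expected_type.__name__}")
    return json.dumps(record, sort_keys=True).encode("utf-8")

payload = serialize_order({"order_id": "o-1", "customer_id": "c-9", "amount": 42.5})
```

The resulting bytes are what a producer would hand to the Kafka client as the message value; the schema check catches malformed records before they ever reach a topic, which is the same contract a registry-backed serializer enforces.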