Russell Tobin

Enterprise Kafka Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Enterprise Kafka Architect in McLean, VA. The contract length is unspecified and the pay rate is listed as unknown. It requires 10+ years of Kafka experience and expertise in cloud platforms; financial services domain experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
576
-
🗓️ - Date
October 11, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
McLean, VA
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Snowflake #Apache Kafka #Terraform #AWS (Amazon Web Services) #Disaster Recovery #Prometheus #Cloud #Data Engineering #DevOps #Azure #Data Integration #Data Processing #Grafana #HADR (High Availability Disaster Recovery) #Security #Python #Scripting #Databricks #Scala #Hadoop #Migration #Zookeeper
Role description
We are working with our client on the new requirement below. Please review the job description and let us know whether you have matching skills.

Position: Enterprise Kafka Architect
Location: McLean, VA (on-site)

Job description:
The Kafka Architect will be responsible for designing, implementing, and managing scalable, high-performance data streaming solutions using the Apache Kafka ecosystem, with a strong focus on Confluent Platform and Confluent Cloud. The role demands deep expertise in real-time data processing, event-driven architecture, and integration with modern cloud and data platforms.

Requirements:
- 10+ years of experience in Kafka architecture and implementation.
- Deep expertise in Apache Kafka, Confluent Platform, Confluent Cloud, Kafka Connect, Kafka Streams, ksqlDB, ZooKeeper, and KRaft.
- Experience with cloud platforms (AWS, GCP, Azure) and CI/CD pipelines.
- Strong understanding of data integration tools (e.g., Snowflake, Databricks, Hadoop).
- Familiarity with scripting (Shell, Python) and infrastructure-as-code (Terraform).
- Financial services or credit union domain experience is highly preferred.

Responsibilities:
- Architect and implement enterprise-grade Kafka solutions using Confluent Platform and Confluent Cloud.
- Design and manage Kafka clusters, brokers, topics, and partitions for optimal performance and reliability.
- Lead migration efforts from legacy messaging systems (e.g., IBM MQ, TIBCO) to Kafka.
- Develop and optimize Kafka Streams, ksqlDB, Kafka Connect, and Flink-based pipelines.
- Ensure high availability, disaster recovery, and security (RBAC, ACLs) of the Kafka infrastructure.
- Collaborate with data engineering, DevOps, and application teams to integrate Kafka into broader enterprise systems.
- Monitor and troubleshoot Kafka environments using tools such as Grafana, Prometheus, and Confluent Control Center.
Thanks & Regards,
Nancy Jaswal
Lead Recruiter, Technology Recruitment (Eastern Time Zone)
nancy.jaswal@russelltobin.com
https://www.linkedin.com/in/nancy-jaswal-b7255b199/
420 Lexington Ave, 30th Fl. • New York, NY 10170