Confluent Kafka SME

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Confluent Kafka SME with strong deployment experience on Azure Kubernetes Service (AKS) using Helm charts. The contract lasts 6-12 weeks and offers competitive pay. Key skills include Kafka configuration, OAuth implementation, disaster recovery, and monitoring automation.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 17, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#Monitoring #Storage #Documentation #Vault #Replication #Deployment #Kafka (Apache Kafka) #YAML (YAML Ain't Markup Language) #Automation #Cloud #Azure #Security #Kubernetes #Scala #Disaster Recovery #ETL (Extract, Transform, Load)
Role description
Confluent Kafka SME

Redapt Inc. is a pioneering, world-class data center infrastructure integrator, technology engineering firm, and cloud services provider. Our teams focus on delivering innovative solutions and services that power our customers' most demanding applications and enable them to extract powerful insights from data that drive true business value.

Redapt is looking for a Confluent Kafka SME with deployment experience on Kubernetes. This is a hands-on position working alongside the team. It requires strong Confluent Kafka deployment and configuration experience on Azure Kubernetes Service (AKS), with deployments via Helm charts.

Project Details
• Hands-on configuration of Confluent Kafka in Kubernetes for a roughly 6-12 week engagement.

Project Outline

Phase 1: Foundation and Core Implementation

Environment Onboarding & Planning
• Complete architecture walkthrough with the team
• Finalize access provisioning and environment familiarization
• Establish implementation priorities and approval workflows
• Begin preparation for OAuth implementation

OAuth Implementation
• Configure OAuth integration with the identity provider (Microsoft Entra ID on Azure)
• Implement token issuance and validation mechanisms
• Deploy and test the OAuth configuration in the development environment
• Troubleshoot and optimize authentication flows
• Buffer time for unexpected configuration challenges

Disaster Recovery & Backup Implementation
• Optimize the DR configuration to meet 4-hour RTO / 1-hour RPO requirements
• Implement and test automated recovery procedures
• Deploy monitoring for replication lag and recovery readiness
• Perform controlled recovery testing
• Buffer time for scenario testing and documentation
• Determine best practices for snapshotting/backup of corrupt data streams (if needed)

Phase 2: Enhancement & Optimization

Confluent Operator Enhancements
• Implement automation for routine operational tasks
• Configure and test operator features for self-healing
• Optimize the operator configuration for the environment
• Implement custom health checks and alerts
• Buffer time for refinement based on initial testing

Operational: Monitoring Implementation
• Deploy a comprehensive metrics collection framework
• Create custom dashboards integrated with monitoring
• Configure an alerting system for critical Kafka metrics
• Buffer time for dashboard refinement and alert tuning

Operational: Hardware Scalability
• Automated scanning of Kafka best practices in Confluent for Kubernetes (CFK)
• Metrics and alerts to initiate scaling (up or down)
• Performance throughput thresholds

Phase 3: Certificate Automation
• Implement certificate automation to reduce manual recertification effort and improve security

SecOps
• Document YAML configuration and security best practices; store secrets in secrets management (Azure Key Vault)
• Enable MDS support for Microsoft OAuth

Phase 4
• Complete knowledge transfer with the team
• Buffer time for addressing any final implementation issues
• Carry-over work as needed
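To give a flavor of the OAuth work in Phase 1, the fragment below sketches what enabling SASL/OAUTHBEARER against Microsoft Entra ID might look like on a CFK-managed cluster, using standard Apache Kafka broker properties (KIP-768) passed through CFK's `configOverrides`. The listener name, tenant ID, and audience are placeholders, and the validator callback handler class path varies by Kafka version — treat this as a sketch to be checked against the Confluent for Kubernetes documentation, not a drop-in config.

```yaml
# Sketch only: CFK Kafka CR fragment enabling OAuth (SASL/OAUTHBEARER)
# against Microsoft Entra ID. <tenant-id> and <app-id-uri> are placeholders.
apiVersion: platform.confluent.io/v1beta1
kind: Kafka
metadata:
  name: kafka
  namespace: confluent
spec:
  configOverrides:
    server:
      # Enable the OAUTHBEARER mechanism on the (assumed) "external" listener
      - listener.name.external.sasl.enabled.mechanisms=OAUTHBEARER
      # Broker-side token validation via the IdP's JWKS endpoint (KIP-768)
      - listener.name.external.sasl.oauthbearer.jwks.endpoint.url=https://login.microsoftonline.com/<tenant-id>/discovery/v2.0/keys
      - listener.name.external.sasl.oauthbearer.expected.audience=<app-id-uri>
```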
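The DR monitoring bullet ("replication lag and recovery readiness" against a 4-hour RTO / 1-hour RPO) could be backed by Prometheus alert rules along these lines. The metric names below are illustrative — the exact names depend on how the JMX exporter is configured — and the lag threshold is an assumed placeholder to be derived from the actual RPO budget.

```yaml
# Sketch only: Prometheus Operator alert rules for DR readiness.
# Metric names and thresholds are assumptions; adjust to the JMX
# exporter's actual relabeling and the agreed RPO/RTO targets.
apiVersion: monitoring.coreos.com/v1
kind: PrometheusRule
metadata:
  name: kafka-dr-alerts
  namespace: monitoring
spec:
  groups:
    - name: kafka-dr
      rules:
        - alert: KafkaUnderReplicatedPartitions
          # Any sustained under-replication threatens recovery readiness
          expr: kafka_server_replicamanager_underreplicatedpartitions > 0
          for: 5m
          labels:
            severity: warning
          annotations:
            summary: "Under-replicated partitions detected on {{ $labels.instance }}"
        - alert: KafkaReplicationLagHigh
          # Placeholder threshold; size it so sustained lag stays within the 1h RPO
          expr: kafka_server_fetcherlagmetrics_consumerlag > 10000
          for: 15m
          labels:
            severity: critical
          annotations:
            summary: "Replication lag high on {{ $labels.instance }}; RPO at risk"
```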
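For the Phase 3 certificate-automation item, one common approach on AKS is cert-manager, which renews broker TLS certificates automatically and removes the manual recertification effort the listing mentions. The issuer name and DNS names below are assumptions for illustration; the listing does not specify the tooling.

```yaml
# Sketch only: cert-manager Certificate for broker TLS. The issuer
# "internal-ca-issuer" and the DNS names are assumed placeholders.
apiVersion: cert-manager.io/v1
kind: Certificate
metadata:
  name: kafka-broker-tls
  namespace: confluent
spec:
  secretName: kafka-broker-tls        # Secret the brokers mount for TLS
  duration: 2160h                     # 90-day certificate lifetime
  renewBefore: 360h                   # renew 15 days before expiry
  dnsNames:
    - kafka.confluent.svc.cluster.local
    - "*.kafka.confluent.svc.cluster.local"
  issuerRef:
    name: internal-ca-issuer
    kind: ClusterIssuer
```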
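The SecOps bullet about storing secrets in Azure Key Vault is often implemented with the Secrets Store CSI Driver, which mounts Key Vault objects into pods. The fragment below is a sketch of that pattern; the identity ID, vault name, and object names are placeholders, and the Azure provider's parameter names should be verified against its documentation.

```yaml
# Sketch only: Secrets Store CSI Driver class pulling Kafka credentials
# from Azure Key Vault. All identifiers below are placeholders.
apiVersion: secrets-store.csi.x-k8s.io/v1
kind: SecretProviderClass
metadata:
  name: kafka-credentials
  namespace: confluent
spec:
  provider: azure
  parameters:
    useVMManagedIdentity: "true"
    userAssignedIdentityID: "<managed-identity-client-id>"
    keyvaultName: "<keyvault-name>"
    tenantId: "<tenant-id>"
    objects: |
      array:
        - |
          objectName: kafka-oauth-client-secret
          objectType: secret
```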