MethodHub

Kafka Engineer - W2 (USC/GC/GC EAD)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Mid-Level Kafka Engineer (AWS) with 5–7 years of experience in real-time data streaming. It is a remote, 6+ month contract requiring East Coast hours; expertise in Apache Kafka and AWS services is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
November 7, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#DevOps #Data Governance #Lambda (AWS Lambda) #Jenkins #Disaster Recovery #AWS (Amazon Web Services) #JSON (JavaScript Object Notation) #Infrastructure as Code (IaC) #Compliance #Data Ingestion #Ansible #IAM (Identity and Access Management) #Automation #Scala #EC2 #S3 (Amazon Simple Storage Service) #Data Pipeline #VPC (Virtual Private Cloud) #Cloud #GitHub #Terraform #Security #Prometheus #Grafana #Monitoring #Kafka (Apache Kafka)
Role description
Job Title: Mid-Level Kafka Engineer (AWS) - W2
Location: Remote (must work East Coast hours; NYC preferred)
Duration: 6+ months
Experience Required: 5–7 years

Job Description:
We are seeking an experienced mid-level Kafka Engineer with strong expertise in real-time data streaming and AWS cloud integration. The ideal candidate will design, develop, and maintain high-performance streaming architectures leveraging Apache Kafka and AWS services.

Roles & Responsibilities:
• Design, develop, and maintain Kafka-based streaming architectures for real-time data ingestion and processing.
• Build and manage Kafka topics, producers, consumers, partitions, and connectors for various data sources and sinks.
• Integrate Kafka with AWS services such as EC2, MSK, S3, Lambda, Kinesis, and other cloud components.
• Optimize Kafka clusters for scalability, reliability, and performance.
• Manage Kafka Streams, Kafka Connect, and Schema Registry (Avro, JSON, Protobuf).
• Monitor and troubleshoot Kafka clusters using tools such as Confluent Control Center, Grafana, or Prometheus.
• Ensure data governance and security compliance using Schema Registry, RBAC, and encryption standards.
• Implement infrastructure automation using Terraform, CloudFormation, or other IaC tools.
• Collaborate with application and DevOps teams to ensure seamless streaming data pipelines.
• Participate in architecture discussions, capacity planning, and disaster recovery strategies.
• Provide technical mentorship to junior engineers and assist with Kafka-related troubleshooting.

Must Have Skills:
• 5–7 years of experience in real-time data streaming using Apache Kafka
• Expertise with Kafka core components: brokers, topics, partitions, producers, consumers
• Proficiency with Kafka Streams, Kafka Connect, and Schema Registry
• Strong experience with AWS services (MSK, EC2, S3, Lambda, Glue, CloudWatch, IAM)
• Hands-on experience with infrastructure automation tools (Terraform, CloudFormation, Ansible)
• Familiarity with CI/CD pipelines (Jenkins, GitHub Actions, AWS CodePipeline)
• Deep understanding of Kafka performance tuning and exactly-once delivery semantics (see the sketch following this list)
• Strong analytical and problem-solving skills, with experience monitoring and troubleshooting Kafka clusters
• Knowledge of networking and security concepts (VPC, IAM, SSL/TLS)
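For context on the exactly-once delivery requirement above, here is a minimal sketch of a transactional producer using the standard Apache Kafka Java client. The bootstrap address, topic name, keys, and transactional.id are placeholder assumptions for illustration, not details from this posting.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TransactionalProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder endpoint; an MSK cluster would expose its own bootstrap brokers.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Idempotence plus a transactional.id is what provides exactly-once semantics on the produce side.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "example-producer-1"); // hypothetical id
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            try {
                producer.beginTransaction();
                // Hypothetical topic and payload, purely for illustration.
                producer.send(new ProducerRecord<>("orders", "order-42", "{\"status\":\"created\"}"));
                producer.commitTransaction();
            } catch (Exception e) {
                // Abort so consumers reading with isolation.level=read_committed never see partial writes.
                producer.abortTransaction();
            }
        }
    }
}
```

A matching consumer would set isolation.level=read_committed; the AWS, IaC, and monitoring items listed above are outside the scope of this sketch.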