LanceSoft, Inc.

Kafka Developer with AWS

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Kafka Developer with AWS expertise, offering a 6+ month contract in Tampa, FL (Hybrid/Onsite) at a pay rate of $48/hr. on W2 or $53/hr. on C2C. Key skills include Kafka architecture, AWS services, and programming in Java, Scala, or Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
424
-
🗓️ - Date
October 9, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Tampa, FL
-
🧠 - Skills detailed
#Scala #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #Web Services #Data Engineering #Deployment #Java #NoSQL #VPC (Virtual Private Cloud) #Terraform #Monitoring #Python #Cloud #Security #Kafka (Apache Kafka) #Jenkins #Programming #IAM (Identity and Access Management) #Data Integration #Data Pipeline #EC2 #Data Encryption #AWS (Amazon Web Services) #Apache Kafka #Databases #GitLab
Role description
Role: Kafka Developer with AWS
Location: Tampa, FL (Hybrid/Onsite)
Interview Type: In-Person
Hiring Mode: Contract
Duration: 6+ Months
Pay Rate: $48/hr. on W2 / $53/hr. on C2C

Description:
A Kafka Developer with AWS expertise is responsible for designing, developing, and maintaining real-time data streaming solutions leveraging Apache Kafka within the Amazon Web Services (AWS) cloud environment.

Responsibilities:
• Design, implement, and maintain Kafka producers, consumers, and stream processing applications using languages like Java, Scala, or Python.
• Deploy, manage, and optimize Kafka clusters and related applications on AWS services such as Amazon Managed Streaming for Apache Kafka (MSK), EC2, S3, Lambda, and CloudWatch.
• Develop and manage end-to-end data pipelines involving Kafka Connect, Kafka Streams, and other data integration tools.
• Ensure the performance, scalability, and reliability of Kafka-based systems, including cluster tuning, monitoring, and troubleshooting.
• Implement security best practices for Kafka on AWS, including authentication, authorization (ACLs), and data encryption.
• Utilize infrastructure-as-code tools (e.g., Terraform, CloudFormation) and CI/CD pipelines (e.g., Jenkins, GitLab CI) for efficient deployment and management.
• Work closely with data engineers, architects, and other development teams to understand requirements and deliver robust streaming solutions.

Required:
• Deep understanding of Kafka architecture, concepts (topics, partitions, brokers), and related tools (Kafka Connect, Kafka Streams, Schema Registry).
• Strong experience with relevant AWS services for data streaming and infrastructure management (e.g., MSK, EC2, S3, CloudWatch, IAM, VPC).
• Expertise in one or more programming languages commonly used with Kafka, such as Java, Scala, or Python.
• Knowledge of distributed systems principles and experience building scalable, fault-tolerant applications.
• Familiarity with relational or NoSQL databases for data persistence and integration.
• Strong analytical and problem-solving skills to diagnose and resolve issues in complex distributed environments.
• Excellent communication and collaboration skills to work effectively within a team and with stakeholders.
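For candidates reviewing the "topics, partitions, brokers" requirement above, the sketch below illustrates the core idea in plain Python: Kafka routes each record with a non-null key to a partition by hashing the key modulo the partition count, so all records sharing a key land on the same partition and keep their order. This is a conceptual simulation only; the real Java client uses a murmur2 hash, and a simple CRC32 stands in for it here.

```python
# Conceptual sketch of Kafka-style key-based partitioning (not the
# actual murmur2 hash used by the Java client; CRC32 is a stand-in).
import zlib


def partition_for(key: bytes, num_partitions: int) -> int:
    """Deterministically map a record key to a partition index."""
    return zlib.crc32(key) % num_partitions


def produce(records, num_partitions=3):
    """Append (key, value) records into per-partition logs,
    mimicking how a topic's partitions accumulate messages."""
    partitions = {p: [] for p in range(num_partitions)}
    for key, value in records:
        partitions[partition_for(key, num_partitions)].append((key, value))
    return partitions


if __name__ == "__main__":
    events = [(b"user-1", "login"), (b"user-2", "login"),
              (b"user-1", "click"), (b"user-1", "logout")]
    # All user-1 events hash to one partition, preserving per-key order.
    for p, recs in sorted(produce(events).items()):
        print(p, recs)
```

The takeaway for cluster design: per-key ordering is guaranteed only within a partition, which is why key choice and partition count are tuning decisions, not afterthoughts.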