

Python Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Python Developer with 8-10 years of experience, focusing on AWS, Kafka, and Kubernetes. It offers a hybrid work location in Detroit/Charlotte or Austin, TX, with a competitive pay rate. Key skills include API development and Terraform expertise.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 9, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Austin, Texas Metropolitan Area
Skills detailed: #Database Systems #Cloud #Deployment #Redis #AWS (Amazon Web Services) #MS SQL (Microsoft SQL Server) #DevOps #Kubernetes #Scala #Data Pipeline #Data Processing #API (Application Programming Interface) #Terraform #SQL (Structured Query Language) #Security #GCP (Google Cloud Platform) #Apache Kafka #GitLab #Azure #REST (Representational State Transfer) #Prometheus #Microservices #Zookeeper #REST API #Automated Testing #Grafana #NoSQL #Python #Docker #Kafka (Apache Kafka)
Role description
- Role: Sr. Developer
- 8-10 years of experience
- Primary skills: Python, AWS, and API build experience
- Secondary skills: Terraform, GitLab, Apigee, and REST API
- Hybrid (Detroit/Charlotte, 3 days in office) or Austin, TX
We are seeking a Python Developer with expertise in Apache Kafka and Kubernetes to design, deploy, and optimize high-throughput data pipelines and microservices. You'll architect event-driven systems, ensure seamless container orchestration, and collaborate with cross-functional teams to deliver robust solutions.
Key Responsibilities
Design, develop, and maintain Python-based microservices and event-driven applications.
Build and manage Kafka producers/consumers, topics, and streams for real-time data processing.
Containerize applications using Docker and orchestrate deployments via Kubernetes (EKS/GKE/AKS).
Implement CI/CD pipelines (e.g., GitLab CI, ArgoCD) for automated testing and deployment.
Monitor system performance using tools like Prometheus/Grafana and troubleshoot issues in distributed environments.
Collaborate with DevOps/SRE teams to enhance scalability, security, and reliability of cloud infrastructure (AWS/Azure/GCP).
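To illustrate the Kafka producer/consumer work described above, here is a minimal sketch. The topic name, broker address, and use of the kafka-python library are assumptions for illustration, and the serialization helpers are hypothetical names, not part of the role description:

```python
import json

def serialize_event(event: dict) -> bytes:
    """Encode an event dict as a compact UTF-8 JSON Kafka message value."""
    return json.dumps(event, separators=(",", ":")).encode("utf-8")

def deserialize_event(raw: bytes) -> dict:
    """Decode a Kafka message value back into an event dict."""
    return json.loads(raw.decode("utf-8"))

# Wiring into kafka-python (assumed library; requires a running broker):
# from kafka import KafkaProducer, KafkaConsumer
# producer = KafkaProducer(
#     bootstrap_servers="localhost:9092",
#     value_serializer=serialize_event,
# )
# producer.send("orders", {"order_id": 42, "status": "created"})
# consumer = KafkaConsumer(
#     "orders",
#     bootstrap_servers="localhost:9092",
#     value_deserializer=deserialize_event,
# )
```

Keeping serialization in small pure functions like these makes the message format easy to unit-test without a live broker.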
Required Qualifications
9+ years of Python development experience.
4+ years hands-on experience with Apache Kafka (including brokers, ZooKeeper, Kafka Streams/KSQL).
5+ years deploying and managing containerized apps on Kubernetes (helm charts, operators, CRDs).
Proficiency in cloud platforms (AWS, Azure, or GCP) and infrastructure-as-code (Terraform/CloudFormation).
Experience with database systems (SQL/NoSQL) and caching solutions (Redis).
Strong understanding of distributed systems, event sourcing, and streaming architectures.
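The Redis caching requirement above typically implies a cache-aside pattern. Below is a minimal sketch; the in-memory store is a stand-in for a Redis client's get/setex (in production you would use redis-py against a real Redis instance), and all names here are illustrative:

```python
import time
from typing import Any, Callable, Optional

class InMemoryTTLStore:
    """Stand-in for a Redis client's get/setex semantics (illustrative only)."""

    def __init__(self) -> None:
        self._data: dict[str, tuple[Any, float]] = {}

    def get(self, key: str) -> Optional[Any]:
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Entry expired: evict it and report a miss.
            del self._data[key]
            return None
        return value

    def setex(self, key: str, ttl_seconds: float, value: Any) -> None:
        self._data[key] = (value, time.monotonic() + ttl_seconds)

def cached_lookup(store: InMemoryTTLStore, key: str,
                  loader: Callable[[], Any], ttl_seconds: float = 60.0) -> Any:
    """Cache-aside: return the cached value, or load and cache it on a miss."""
    value = store.get(key)
    if value is None:
        value = loader()
        store.setex(key, ttl_seconds, value)
    return value
```

The loader callable stands in for the expensive database or API call being cached; the TTL bounds how stale a cached value can get.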