Python Developer (W2 Role)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Developer (W2 Role) on a 12+ month contract, offering competitive pay. Key skills include Python, GCP, OpenShift, Kubernetes, and experience with AI models. Extensive experience deploying Generative AI models and working with data processing frameworks is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Unknown
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Storage #AI (Artificial Intelligence) #Airflow #Kubernetes #S3 (Amazon Simple Storage Service) #Cloud #PySpark #Python #Security #Spark (Apache Spark) #Azure #Data Processing #Model Optimization #Deployment #Scala #Google Cloud Storage #GCP (Google Cloud Platform)
Role description

Job Title: Python Developer with OpenShift

Duration: 12+ Months with possible extension/conversion

Job Overview:

We are seeking a highly skilled Senior Python Engineer / AI-ML (LLM) Engineer to join our innovative AI team. The ideal candidate will have deep experience in Python development, cloud-native architectures, and deploying Generative AI models at scale. You will work on cutting-edge projects involving LLMs such as LLaMA, Mistral, and other foundation models, contributing to real-time data streaming, model optimization, and scalable AI infrastructure.

Candidates must have:

   • Proven experience as a Senior Python Engineer/Developer, with a strong portfolio of relevant projects.

   • Extensive experience working with Google Cloud Platform (GCP) is required; Azure experience is not a substitute.

   • Hands-on experience with GPU clusters, particularly for deploying AI models.

   • Proficiency with container orchestration technologies such as OpenShift and Kubernetes.

   • Understanding of data processing and orchestration frameworks, specifically PySpark and Airflow.

   • Familiarity with object storage solutions (S3 and Google Cloud Storage).

   • Excellent problem-solving skills and ability to work independently as well as part of a team.

Job Responsibilities:

   • Develop, test, and deploy Python applications in cloud environments, specifically GCP and OpenShift.

   • Design and implement solutions for deploying and managing LLM models on GPU clusters.

   • Collaborate with cross-functional teams to integrate systems and optimize cloud resources.

   • Work with container orchestration technologies such as Kubernetes to deploy and scale applications effectively.

   • Utilize Python, PySpark, and Airflow for data processing and orchestration.

   • Manage object storage solutions like S3 and Google Cloud Storage.

   • Ensure best practices for cloud security, performance, and cost management.

   • Solve complex technical challenges and contribute to the architecture of scalable and reliable systems.