Talent Groups

GPU Inference Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a remote, contract-based GPU Inference Engineer role requiring deep experience with cloud services and distributed systems, particularly hosting LLMs for inference. Key skills include Kubernetes, Docker, CI/CD, and strong communication abilities.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Deployment #Docker #Cloud #Kubernetes #Storage #Logging #Documentation #Infrastructure as Code (IaC) #Monitoring #Data Storage
Role description
Role: GPU Inference Engineer
Location: Remote
Duration: Contract
Job Description: Dedicated Inference Service
• We are now looking for developers with general cloud services / distributed systems experience, with LLM experience as a secondary skill.
Required Skills
• Deep experience building services in modern cloud environments on distributed systems, e.g., containerization (Kubernetes, Docker), infrastructure as code, CI/CD pipelines, APIs, authentication and authorization, data storage, deployment, logging, monitoring, and alerting
• Experience working with Large Language Models (LLMs), particularly hosting them to run inference
• Strong verbal and written communication skills; the job involves communicating with local and remote colleagues about technical subjects and writing detailed documentation
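For context on the core requirement, a minimal sketch of what hosting an LLM to run inference behind an API can look like is below. This is illustrative only and not part of the posting: it assumes FastAPI, Hugging Face Transformers, and a single local GPU, and the model name and endpoint path are hypothetical choices.

```python
# Illustrative sketch of a minimal GPU inference endpoint (not from the posting).
# Assumes fastapi, pydantic, transformers, and torch are installed and a GPU is available.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Hypothetical model choice; any causal LM that fits on the GPU would do.
# device=0 places the model on the first CUDA device.
generator = pipeline("text-generation", model="gpt2", device=0)

class GenerateRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 64

@app.post("/generate")
def generate(req: GenerateRequest):
    # Run inference on the GPU and return only the generated text.
    outputs = generator(req.prompt, max_new_tokens=req.max_new_tokens)
    return {"completion": outputs[0]["generated_text"]}

# Run with: uvicorn app:app --host 0.0.0.0 --port 8000
# In a production setting this would be containerized (Docker) and deployed via
# Kubernetes with GPU resource requests, plus auth, logging, monitoring, and alerting.
```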