

TEKFORTUNE INC
Machine Learning Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Machine Learning Engineer with a contract length of "X months" and a pay rate of "$X/hour". It requires 3+ years building backend systems, experience shipping production AI systems, proficiency in Python, and deep cloud infrastructure knowledge.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, TX
-
🧠 - Skills detailed
#Deployment #GCP (Google Cloud Platform) #Kubernetes #Regression #Batch #TypeScript #AI (Artificial Intelligence) #Docker #Cloud #ML (Machine Learning) #AWS (Amazon Web Services) #Observability #Python #API (Application Programming Interface) #Azure
Role description
Requirements
• Backend/Systems Experience: 3+ years building production backend or distributed systems (pre-AI experience required)
• Production AI Systems: has shipped AI/LLM features serving real users at scale, not just prototypes or demos
• Agentic Systems: has built AI agents, skills, tools, or MCP (Model Context Protocol) integrations
• Python: proficient for backend development
• Secondary Language: working knowledge of Go, TypeScript, or Rust
• Cloud Infrastructure: deep experience with AWS/GCP/Azure, including cost optimization and compute decisions, not just deployment
• Container & Orchestration: hands-on with Docker and Kubernetes; can build, deploy, debug, and scale services themselves
• LLM Integration: understands token economics, context limits, rate limiting, structured outputs, and API failure modes
• LLM Evaluation: understands how to evaluate LLM outputs and the inherent challenges (non-determinism, quality measurement, regression detection)
• Hands-On Engineer: not just an architect; writes code, debugs production issues, deploys their own work
________________________________________
Preferred / Differentiators
• Built multi-step agentic workflows with tool use and function calling
• Experience with agent orchestration frameworks (LangGraph, CrewAI, or custom)
• Built guardrails, fallbacks, or graceful degradation for AI systems
• Streaming inference and async agent orchestration
• Cost/latency optimization: caching, batching, prompt compression
• ML observability tools: Langfuse, Arize, Braintrust, W&B
• Retrieval systems (vector search, hybrid search) — as a tool, not the focus
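Purely as an illustration of the LLM-integration skills listed above (rate limiting, structured outputs, API failure modes), here is a minimal Python sketch using only the standard library. The helper names are hypothetical and not part of this role description; a real client would catch its provider's specific rate-limit exception rather than the generic `RuntimeError` used here as a stand-in:

```python
import json
import random
import time


def call_with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry a flaky LLM API call with exponential backoff and jitter.

    `call` is any zero-argument function; RuntimeError stands in for a
    provider rate-limit error in this sketch.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except RuntimeError:
            if attempt == max_retries - 1:
                raise
            # Exponential backoff with jitter to avoid retry stampedes.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))


def parse_structured_output(raw, required_keys):
    """Validate that a model's text response is JSON with the expected keys."""
    data = json.loads(raw)  # raises ValueError on malformed model output
    missing = [k for k in required_keys if k not in data]
    if missing:
        raise ValueError(f"model output missing keys: {missing}")
    return data
```

For example, wrapping a call that is rate-limited twice before succeeding, then validating the returned JSON against an expected schema, exercises both failure modes a candidate would be expected to handle.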