

BuzzClan
Machine Learning Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Machine Learning Engineer with a contract length of "unknown," offering a pay rate of "unknown," and is remote. Key skills include MLOps, Dataiku, AWS services, CI/CD, and agentic systems. A Bachelor's degree and relevant experience are required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: February 11, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Reading, PA
Skills detailed: #AI (Artificial Intelligence) #Indexing #GIT #API (Application Programming Interface) #Computer Science #DevOps #Lambda (AWS Lambda) #OpenSearch #S3 (Amazon Simple Storage Service) #A/B Testing #Observability #DynamoDB #Docker #Data Governance #ML (Machine Learning) #Kubernetes #Dataiku #AWS (Amazon Web Services) #Data Science #IAM (Identity and Access Management) #Cloud #Databases #Deployment #Strategy #SageMaker #Grafana
Role description
• Looking for a pure MLOps Engineer with hands-on experience in Dataiku (SageMaker is a plus).
Responsibilities
• Design multi-agent architectures: define agent roles (planner, researcher, retriever, executor, reviewer), toolboxes, handoffs, memory strategy (short/long-term), and supervisor policies for safe collaboration (see the supervisor sketch after this list).
• Build high-quality RAG: implement ingestion, chunking, embeddings, indexing, and retrieval with evaluation (precision/recall, groundedness, hallucination checks), guardrails, and citations (see the retrieval sketch below).
• Productionize on AWS: leverage services such as Bedrock (Agents/Knowledge Bases/Flows), Lambda, API Gateway, S3, DynamoDB, OpenSearch/vector databases, Step Functions, and CloudWatch for tracing and alerts (see the Lambda handler sketch below).
• MLOps/LLMOps: automate CI/CD (GitOps), containerization (Docker/Kubernetes), infrastructure-as-code, secrets/IAM, blue/green deployments and rollbacks, and data/feature pipelines.
• Observability & evaluation: instrument telemetry (traces, token/cost, latency), build dashboards (Grafana/CloudWatch), and add human-in-the-loop review, A/B testing, and continuous offline/online evals (see the telemetry sketch below).
• Operate reliably at scale: implement caching, rate-limit management, queueing, idempotency, and backoff; proactively detect drift and degradation (see the retry/idempotency sketch below).
• Collaborate & communicate: partner with infra/DevOps/data/architecture teams; document designs, SLIs/SLOs, and runbooks; present status and insights to technical and non-technical stakeholders.
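Illustrative sketches
The sketches below are minimal, hedged illustrations of the responsibilities above, written in plain Python. Every class, function, and identifier in them is a hypothetical placeholder unless noted otherwise. First, a framework-agnostic sketch of the multi-agent pattern: role-named agents, a shared short-term memory, and a supervisor that routes handoffs under a step budget.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Memory:
    # Short-term scratchpad shared within one task; long-term memory would live in external storage.
    notes: List[str] = field(default_factory=list)

    def add(self, note: str) -> None:
        self.notes.append(note)

# Each "agent" is just a named function that reads/writes shared memory and returns the next handoff target.
AgentFn = Callable[[str, Memory], str]

def planner(task: str, mem: Memory) -> str:
    mem.add(f"plan: break '{task}' into retrieve -> execute -> review")
    return "retriever"

def retriever(task: str, mem: Memory) -> str:
    mem.add("retrieved: 3 candidate documents (placeholder)")
    return "executor"

def executor(task: str, mem: Memory) -> str:
    mem.add("draft answer produced from retrieved context")
    return "reviewer"

def reviewer(task: str, mem: Memory) -> str:
    mem.add("review passed: grounded, no policy violations")
    return "done"

def supervisor(task: str, agents: Dict[str, AgentFn], max_steps: int = 8) -> Memory:
    """Route the task between agents until one hands off to 'done' or the step budget runs out."""
    mem, current = Memory(), "planner"
    for _ in range(max_steps):  # the step budget acts as a simple supervision/safety policy
        if current == "done":
            break
        current = agents[current](task, mem)
    return mem

if __name__ == "__main__":
    agents = {"planner": planner, "retriever": retriever, "executor": executor, "reviewer": reviewer}
    result = supervisor("summarize Q3 incident reports", agents)
    print("\n".join(result.notes))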
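A minimal RAG retrieval sketch. The embed_texts function is a placeholder standing in for a real embedding endpoint (for example, a Bedrock embeddings model); chunking is naive fixed-size with overlap, and similarity is plain cosine over normalized vectors.

import numpy as np
from typing import List, Tuple

def chunk(text: str, size: int = 500, overlap: int = 50) -> List[str]:
    """Naive fixed-size character chunking with overlap; real ingestion would respect sentences/sections."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def embed_texts(texts: List[str]) -> np.ndarray:
    """Placeholder embedding call; swap in your real model. Fake vectors keep the sketch self-contained."""
    rng = np.random.default_rng(abs(hash(tuple(texts))) % (2 ** 32))
    return rng.normal(size=(len(texts), 384))

def build_index(docs: List[str]) -> Tuple[List[str], np.ndarray]:
    chunks = [c for d in docs for c in chunk(d)]
    vecs = embed_texts(chunks)
    # L2-normalize so a dot product equals cosine similarity.
    vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    return chunks, vecs

def retrieve(query: str, chunks: List[str], vecs: np.ndarray, k: int = 3) -> List[Tuple[float, str]]:
    q = embed_texts([query])[0]
    q = q / np.linalg.norm(q)
    scores = vecs @ q
    top = np.argsort(scores)[::-1][:k]
    return [(float(scores[i]), chunks[i]) for i in top]

In production the fake embeddings and in-memory index would be replaced by a managed vector store (OpenSearch or similar), and the retrieved chunks would feed groundedness and citation checks before generation.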
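A sketch of the AWS productionization path: an API Gateway proxy event handled in Lambda, querying a Bedrock Knowledge Base through the bedrock-agent-runtime retrieve call. The knowledge base ID is a placeholder environment variable, and the response field names may vary by SDK version, so treat this as a shape rather than a drop-in handler.

import json
import os
import boto3

# Clients are created once per container so warm Lambda invocations reuse them.
bedrock_rt = boto3.client("bedrock-agent-runtime")
KB_ID = os.environ.get("KNOWLEDGE_BASE_ID", "REPLACE_ME")  # placeholder; set via infra-as-code

def handler(event, context):
    """API Gateway (proxy integration) -> Lambda -> Bedrock Knowledge Base retrieval."""
    body = json.loads(event.get("body") or "{}")
    question = body.get("question", "")
    if not question:
        return {"statusCode": 400, "body": json.dumps({"error": "missing 'question'"})}

    # Retrieve top chunks from the knowledge base; check your SDK version for exact parameter names.
    resp = bedrock_rt.retrieve(
        knowledgeBaseId=KB_ID,
        retrievalQuery={"text": question},
        retrievalConfiguration={"vectorSearchConfiguration": {"numberOfResults": 5}},
    )
    citations = [
        {
            "score": r.get("score"),
            "source": r.get("location", {}).get("s3Location", {}).get("uri"),
        }
        for r in resp.get("retrievalResults", [])
    ]
    return {"statusCode": 200, "body": json.dumps({"question": question, "citations": citations})}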
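A telemetry sketch for the observability responsibility: wrap any model call, measure latency, pull token counts from an assumed result shape, and push metrics to CloudWatch with put_metric_data. The result-dict keys are assumptions to adapt to whatever your client actually returns.

import time
import logging
import boto3

logger = logging.getLogger("llm_telemetry")
cloudwatch = boto3.client("cloudwatch")

def instrumented_call(model_fn, prompt: str, namespace: str = "GenAI/Agents"):
    """Wrap an LLM call, record latency and token usage, and emit metrics to CloudWatch.

    model_fn is assumed to return a dict like {"text": ..., "input_tokens": int, "output_tokens": int}.
    """
    start = time.perf_counter()
    result = model_fn(prompt)
    latency_ms = (time.perf_counter() - start) * 1000.0
    in_tok = result.get("input_tokens", 0)
    out_tok = result.get("output_tokens", 0)

    # Structured log line for traces/dashboards; metrics go to CloudWatch for alerting.
    logger.info("llm_call latency_ms=%.1f input_tokens=%d output_tokens=%d", latency_ms, in_tok, out_tok)
    cloudwatch.put_metric_data(
        Namespace=namespace,
        MetricData=[
            {"MetricName": "LatencyMs", "Value": latency_ms, "Unit": "Milliseconds"},
            {"MetricName": "InputTokens", "Value": float(in_tok), "Unit": "Count"},
            {"MetricName": "OutputTokens", "Value": float(out_tok), "Unit": "Count"},
        ],
    )
    return result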
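A reliability sketch covering backoff and idempotency: exponential backoff with full jitter around a flaky call, plus an in-memory idempotency cache that a production system would replace with DynamoDB or Redis and a TTL.

import random
import time
from functools import wraps

_idempotency_cache = {}  # in production: DynamoDB/Redis with a TTL, not process memory

def reliable(max_retries: int = 5, base_delay: float = 0.5):
    """Retry a flaky call with exponential backoff + jitter; short-circuit repeats via an idempotency key."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(idempotency_key: str, *args, **kwargs):
            if idempotency_key in _idempotency_cache:
                return _idempotency_cache[idempotency_key]  # cached result: safe to return again
            for attempt in range(max_retries):
                try:
                    result = fn(*args, **kwargs)
                    _idempotency_cache[idempotency_key] = result
                    return result
                except Exception:
                    if attempt == max_retries - 1:
                        raise
                    # Exponential backoff with full jitter to spread retries under rate limits.
                    time.sleep(random.uniform(0, base_delay * (2 ** attempt)))
        return wrapper
    return decorator

@reliable()
def call_model(prompt: str) -> str:
    # Placeholder for a rate-limited model/API call.
    return f"answer to: {prompt}"

if __name__ == "__main__":
    print(call_model("req-123", "summarize the runbook"))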
Minimum Qualifications
• Bachelor's degree in Computer Science, Data Science, Engineering, or a related field, or equivalent experience.
• Proven experience building agentic systems (single- or multi-agent) and RAG pipelines in production.
• Strong cloud background for AI/ML workloads; familiarity with Bedrock or equivalent LLM platforms.
• Solid CI/CD and containerization skills (Git, Docker, Kubernetes) and infrastructure-as-code fundamentals.
• Knowledge of data governance and model accountability throughout the MLOps/LLMOps lifecycle.
• Excellent communication, collaboration, and problem-solving skills; ability to work independently and within cross-functional teams.
• Passion for Generative AI and the impact of agent-based solutions across industries.
Preferred / Good to Have
• Experience with AWS Bedrock Agents/Knowledge Bases/Flows, OpenSearch (or other vector databases), Step Functions, Lambda, API Gateway, DynamoDB, and S3.
• Dataiku platform exposure (Govern, approvals, artifacts, and MLOps deployment flows); SageMaker for custom model hosting.
• Familiarity with agent frameworks (e.g., LangGraph, crewAI, Semantic Kernel, AutoGen) and evaluation frameworks (guardrails, groundedness, hallucination checks).
• Dataiku certifications (nice to have): ML Practitioner, Advanced Designer, MLOps Practitioner.