

Acumenz Consulting
Senior MLOps Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior MLOps Engineer in Reading, Pennsylvania, with a contract length of unspecified duration and a pay rate of "unknown." Key skills include Dataiku, AWS, CI/CD, and agent-based system development. A Bachelor's degree and relevant experience are required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: February 21, 2026
Duration: Unknown
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Reading, PA
Skills detailed: #Databases #API (Application Programming Interface) #Kubernetes #IAM (Identity and Access Management) #Cloud #Docker #Lambda (AWS Lambda) #Grafana #S3 (Amazon Simple Storage Service) #Data Science #Data Governance #AI (Artificial Intelligence) #MLOps (Machine Learning Operations) #Computer Science #DevOps #Observability #Strategy #Git #ML (Machine Learning) #Deployment #OpenSearch #Indexing #DynamoDB #A/B Testing #AWS (Amazon Web Services) #SageMaker #Dataiku
Role description
Job Title - Sr. MLOps Engineer
Location - Reading, Pennsylvania; on-site at the client location, 5 days a week
Must have
Looking for a pure MLOps Engineer with hands-on experience in Dataiku (SageMaker is a plus).
Responsibilities
• Design multi-agent architectures: define agent roles (planner, researcher, retriever, executor, reviewer), toolboxes, handoffs, memory strategy (short/long-term), and supervisor policies for safe collaboration.
• Build high-quality RAG: implement ingestion, chunking, embeddings, indexing, and retrieval with evaluation (precision/recall, groundedness, hallucination checks), guardrails, and citations.
• Productionize on AWS: leverage services like Bedrock (Agents/Knowledge Bases/Flows), Lambda, API Gateway, S3, DynamoDB, OpenSearch/vector DB, Step Functions, and CloudWatch for tracing and alerts.
• MLOps/LLMOps: automate CI/CD (GitOps), containerization (Docker/Kubernetes), infrastructure-as-code, secrets/IAM, blue/green deployments and rollbacks, and data/feature pipelines.
• Observability & evaluation: instrument telemetry (traces, token/cost, latency), build dashboards (Grafana/CloudWatch), and add human-in-the-loop review, A/B testing, and continuous offline/online evals.
• Operate reliably at scale: implement caching, rate-limit management, queueing, idempotency, and backoff; proactively detect drift and degradation.
• Collaborate & communicate: partner with infra/DevOps/data/architecture teams; document designs, SLIs/SLOs, and runbooks; present status and insights to technical and non-technical stakeholders.
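As a flavor of the "operate reliably at scale" bullet above, here is a minimal sketch of capped exponential backoff with full jitter, one of the retry patterns commonly used around rate-limited LLM and AWS API calls. The function name and parameters are illustrative, not part of the role description.

```python
import random
import time


def backoff_retry(fn, max_attempts=5, base=0.5, cap=8.0):
    """Call fn(), retrying on any exception with capped exponential backoff.

    Sleeps a random amount ("full jitter") up to min(cap, base * 2**attempt)
    between attempts, and re-raises the last error once attempts run out.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(random.uniform(0, min(cap, base * 2 ** attempt)))
```

In production this would typically be paired with an idempotency key on the wrapped call, so a retried request that already succeeded server-side is not applied twice.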
Qualifications we seek in you!
Minimum Qualifications
• Bachelor's degree in Computer Science, Data Science, Engineering, or a related field, or equivalent experience.
• Proven experience building agentic systems (single- or multi-agent) and RAG pipelines in production.
• Strong cloud background for AI/ML workloads; familiarity with Bedrock or equivalent LLM platforms.
• Solid CI/CD and containerization skills (Git, Docker, Kubernetes) and infrastructure-as-code fundamentals.
• Knowledge of data governance and model accountability throughout the MLOps/LLMOps lifecycle.
• Excellent communication, collaboration, and problem-solving skills; ability to work independently and within cross-functional teams.
• Passion for Generative AI and the impact of agent-based solutions across industries.
Preferred / Good to Have
• Experience with AWS Bedrock Agents/Knowledge Bases/Flows, OpenSearch (or other vector databases), Step Functions, Lambda, API Gateway, DynamoDB, and S3.
• Dataiku platform exposure (Govern, approvals, artifacts, MLOps deployment flows); SageMaker for custom model hosting.
• Familiarity with agent frameworks (e.g., LangGraph, CrewAI, Semantic Kernel, AutoGen) and evaluation frameworks (guardrails, groundedness, hallucination checks).
• Dataiku certifications are nice to have: ML Practitioner, Advanced Designer, MLOps Practitioner.






