bentoml/Yatai
Model Deployment at Scale on Kubernetes 🦄️
This tool helps DevOps teams integrate Machine Learning models into their existing infrastructure. It takes trained BentoML models and deploys them as scalable services on Kubernetes clusters. The output is a running, managed ML service ready for production use, enabling seamless integration of ML into existing GitOps workflows.
838 stars. No commits in the last 6 months.
Use this if you are a DevOps engineer or MLOps practitioner needing to deploy and manage BentoML-packaged machine learning models on a Kubernetes cluster with CI/CD and GitOps practices.
Not ideal if you are looking for a standalone machine learning model training platform or a simple way to deploy models without using Kubernetes.
Stars: 838
Forks: 76
Language: TypeScript
License: —
Category: MLOps
Last pushed: May 08, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/bentoml/Yatai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
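The endpoint above returns this card's data as JSON. Below is a minimal Python sketch of building the URL and reading a response; note the field names (`stars`, `forks`, `last_pushed`) are assumptions for illustration, since the response schema is not documented here.

```python
import json

# URL template matching the curl example above.
API = "https://pt-edge.onrender.com/api/v1/quality/{category}/{owner}/{repo}"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-card API URL for a given repo."""
    return API.format(category=category, owner=owner, repo=repo)

def parse_card(payload: str) -> dict:
    """Pick a few fields out of the JSON response.

    The keys used here are hypothetical; check a live response
    for the actual field names.
    """
    data = json.loads(payload)
    return {
        "stars": data.get("stars"),
        "forks": data.get("forks"),
        "last_pushed": data.get("last_pushed"),
    }

# Canned sample response so the sketch runs without a network call.
sample = '{"stars": 838, "forks": 76, "last_pushed": "2024-05-08"}'
print(quality_url("mlops", "bentoml", "Yatai"))
print(parse_card(sample)["stars"])
```

To fetch a live response, pass the URL from `quality_url` to `urllib.request.urlopen` or `requests.get`; without a key, stay under the 100 requests/day limit.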
Higher-rated alternatives
apache/airflow
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
mlrun/mlrun
MLRun is an open source MLOps platform for quickly building and managing continuous ML...
clearml/clearml
ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data...
argoproj-labs/hera
Hera makes Python code easy to orchestrate on Argo Workflows through native Python integrations....
argoproj/argo-workflows
Workflow Engine for Kubernetes