LLM Inference Serving MLOps Tools

There are 8 LLM inference serving tools tracked. 3 score above 70 (verified tier). The highest-rated is nndeploy/nndeploy at 76/100 with 1,762 stars. 3 of the 8 are actively maintained.

Get all 8 projects as JSON:

curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=mlops&subcategory=llm-inference-serving&limit=20"

The API is open to everyone: 100 requests/day with no key needed, or 1,000 requests/day with a free key.
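The same endpoint can be queried from a script instead of curl. Below is a minimal Python sketch using only the standard library; note that the shape of the JSON response is an assumption, since the API schema is not documented here.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    # Assemble the same query string the curl example uses.
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    return BASE + "?" + urllib.parse.urlencode(params)

def fetch_tools(domain: str = "mlops",
                subcategory: str = "llm-inference-serving") -> dict:
    # Anonymous access is rate-limited to 100 requests/day.
    url = build_url(domain, subcategory)
    with urllib.request.urlopen(url, timeout=10) as resp:
        # Assumption: the endpoint returns a JSON document.
        return json.load(resp)

if __name__ == "__main__":
    print(build_url("mlops", "llm-inference-serving"))
```

`build_url` reproduces the documented query parameters exactly, so the request stays identical whether it is issued from curl or from Python.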

Rank. Tool (Score, Tier): Description

1. nndeploy/nndeploy (76, Verified): An Easy-to-Use and High-Performance AI Deployment Framework
2. bentoml/BentoML (76, Verified): The easiest way to serve AI apps and models - Build Model Inference APIs,...
3. kubeflow/trainer (71, Verified): Distributed AI Model Training and LLM Fine-Tuning on Kubernetes
4. cncf/llm-in-action (46, Emerging): Discover how to apply your LLM app skills on Kubernetes!
5. llmcloud24/de.KCD-Summer-School-2024 (33, Emerging): Learn how to deploy your own LLM in the de.NBI cloud via a step-by-step...
6. ray-project/llms-in-prod-workshop-2023 (33, Emerging): Deploy and Scale LLM-based applications
7. SohamGovande/podplex (26, Experimental): Distributed training & serverless inference at scale on RunPod
8. ArslanKAS/Serverless-LLM-Amazon-Bedrock (24, Experimental): You'll learn how to deploy a large language model-based application into...