Mattbusel/tokio-prompt-orchestrator
Multi-core, Tokio-native orchestration for LLM pipelines.
Built for developers whose applications call Large Language Models (LLMs) and must handle many concurrent user requests reliably. It takes raw text prompts or conversational turns, routes them through multiple LLM providers, and returns the generated responses, keeping LLM-powered applications fast, cost-effective, and secure in production.
Use this if your application relies heavily on LLMs and you need to manage high request volumes, control costs, and tighten security.
Not ideal if you are just experimenting with LLMs or building a small, low-traffic application where advanced orchestration and performance features are not critical.
Stars: 50
Forks: 4
Language: Rust
License: MIT
Category:
Last pushed: Mar 10, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Mattbusel/tokio-prompt-orchestrator"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
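The curl command above can be generalized to any repository slug. Below is a minimal Python sketch that builds the endpoint URL for an arbitrary `owner/repo` pair and fetches it; the endpoint base is taken from this page, but the JSON response schema is not documented here, so the script simply pretty-prints whatever comes back.

```python
# Sketch: query the pt-edge quality API for any "owner/repo" slug.
# Assumption: the path layout is /api/v1/quality/mcp/<owner>/<repo>,
# as shown in the curl example on this page; the response schema
# is undocumented here, so we just print the raw JSON.
import json
import urllib.request
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"

def quality_url(slug: str) -> str:
    """Return the quality-API URL for a GitHub-style owner/repo slug."""
    owner, repo = slug.split("/", 1)
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

if __name__ == "__main__":
    with urllib.request.urlopen(
        quality_url("Mattbusel/tokio-prompt-orchestrator")
    ) as resp:
        print(json.dumps(json.load(resp), indent=2))
```

URL-encoding the path segments with `quote` keeps the request valid even for slugs containing unusual characters.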
Higher-rated alternatives
mcp-use/mcp-use
The fullstack MCP framework to develop MCP Apps for ChatGPT / Claude & MCP Servers for AI Agents.
TencentCloudBase/CloudBase-MCP
CloudBase MCP - Connect CloudBase to your AI Agent. Go from AI prompt to live app.
zhizhuodemao/js-reverse-mcp
JS reverse-engineering MCP server designed for AI agents, with built-in anti-detection, rebuilt on chrome-devtools-mcp | JS reverse engineering MCP...
juspay/neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and...
casibase/casibase
⚡️AI Cloud OS: Open-source enterprise-level AI knowledge base and MCP...