postrv/forgemax

Code Mode-inspired, locally sandboxed MCP gateway that collapses N servers × M tools into 2 tools (~1,000 tokens)

Quality score: 41 / 100 (Emerging)

This tool helps AI engineers manage interactions between large language models (LLMs) and many external tools or APIs. It takes any number of MCP servers and their tools and presents them to the LLM as just two functions. The LLM receives a compact description of what is available and returns JavaScript code to execute actions, which reduces the tokens spent on tool descriptions and improves reliability.
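The two-function collapse described above can be sketched as follows. This is an illustrative mock, not forgemax's actual API: the names `describe_tools` and `execute_code`, the `call(...)` helper, and the toy tool registry are all assumptions made for the example, and real sandboxing is omitted.

```python
# Hypothetical sketch of the "N servers x M tools -> 2 tools" gateway pattern.
# Function and tool names are illustrative, not forgemax's real interface.

TOOLS = {
    "weather.get_forecast": lambda city: f"Sunny in {city}",
    "calendar.create_event": lambda title: f"Created event: {title}",
}

def describe_tools() -> str:
    """Tool 1: return a compact catalog instead of N x M full schemas."""
    return "\n".join(sorted(TOOLS))

def execute_code(code: str) -> str:
    """Tool 2: run model-written code with the tool registry in scope."""
    scope = {"call": lambda name, *args: TOOLS[name](*args), "result": None}
    exec(code, scope)  # a real gateway would sandbox this execution
    return scope["result"]

print(describe_tools())
print(execute_code('result = call("weather.get_forecast", "Oslo")'))
```

However many servers sit behind the registry, the model only ever sees these two entry points, so the per-request tool overhead stays roughly constant.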


Use this if you are building an LLM-powered application that needs to interact with many different external services or APIs and you want to reduce context window usage and improve the LLM's reliability in using these tools.

Not ideal if your LLM application only uses a handful of tools and context window size or tool orchestration complexity is not a primary concern.

Tags: LLM-tool-orchestration, AI-agent-development, API-integration, prompt-engineering, context-window-optimization
No package · No dependents

Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 11 / 25
Community: 10 / 25


Stars: 135
Forks: 8
Language: Rust
License: not specified
Last pushed: Mar 07, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/postrv/forgemax"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.