FlowLLM-AI/flowllm
FlowLLM: Simplifying LLM-based HTTP/MCP Service Development
FlowLLM helps developers build AI-powered services by packaging Large Language Models (LLMs), embeddings, and vector databases into accessible HTTP or MCP services. It takes your custom AI logic and configuration, then generates ready-to-use API endpoints or command-line tools. This is for developers or teams looking to quickly deploy AI assistants, RAG applications, or complex AI workflows.
Used by 2 other packages. Available on PyPI.
Use this if you are a developer who needs to rapidly prototype and deploy LLM-based applications as services, without manually handling the API setup for each component.
Not ideal if you are looking for a no-code solution or a simple library for direct LLM interaction within an existing application.
Stars: 32
Forks: 2
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 18, 2026
Commits (30d): 0
Dependencies: 19
Reverse dependents: 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/FlowLLM-AI/flowllm"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
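The same endpoint can be called from code. Below is a minimal Python sketch using only the standard library; note that the response schema and the mechanism for passing an API key are not documented on this page, so the returned JSON is treated as an opaque dict and the key is omitted.

```python
import json
from urllib.request import urlopen

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch quality metrics for a repo and parse the JSON response.

    The response schema is not documented here, so callers should
    inspect the dict rather than rely on specific fields.
    """
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Fetch the metrics shown on this page (requires network access).
    print(fetch_quality("FlowLLM-AI", "flowllm"))
```

The key-free tier (100 requests/day) is enough for occasional lookups like this; for batch use, a free key would be needed.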
Higher-rated alternatives
thedaviddias/mcp-llms-txt-explorer
MCP to explore websites with llms.txt files
jonigl/ollama-mcp-bridge
Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context...
sib-swiss/sparql-llm
🦜✨ Chat system, MCP server, and reusable components to improve LLMs capabilities when generating...
CodeLogicIncEngineering/codelogic-mcp-server
An MCP Server to utilize Codelogic's rich software dependency data in your AI programming assistant.
webworn/openfoam-mcp-server
LLM-powered OpenFOAM MCP server for intelligent CFD education with Socratic questioning and...