run-llama/mcp-server-llamacloud

An MCP server connecting to managed indexes on LlamaCloud

Score: 46 / 100 — Emerging

This server connects a local AI assistant, such as Claude Desktop, to specialized data indexes stored on LlamaCloud. It exposes your defined LlamaCloud indexes as searchable tools directly within the assistant, so anyone using an AI assistant to retrieve information from their own datasets can benefit.

No commits in the last 6 months.

Use this if you want to connect a desktop AI assistant (like Claude, Windsurf, or Cursor) to your own private, structured information repositories hosted on LlamaCloud for tailored searches.

Not ideal if you don't use LlamaCloud for managing your data indexes or if you don't use a local AI assistant that supports the Model Context Protocol (MCP).
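MCP-capable assistants such as Claude Desktop are wired up to servers like this one through a `mcpServers` entry in their configuration file. The sketch below shows the general shape of such an entry; the package name, command-line flags, index name, and environment variable names are illustrative assumptions, so check the repository's README for the exact invocation and required credentials.

```json
{
  "mcpServers": {
    "llamacloud": {
      "command": "npx",
      "args": [
        "-y", "@llamaindex/mcp-server-llamacloud",
        "--index", "my-index",
        "--description", "Search my private documents"
      ],
      "env": {
        "LLAMA_CLOUD_PROJECT_NAME": "my-project",
        "LLAMA_CLOUD_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

Each configured index becomes a named tool the assistant can call during a conversation.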

information-retrieval ai-assistant-integration private-data-search knowledge-management
Flags: Stale (6 months) · No Package · No Dependents
Maintenance 2 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 19 / 25


Stars: 86
Forks: 18
Language: JavaScript
License: MIT
Last pushed: Jun 24, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/run-llama/mcp-server-llamacloud"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
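The same endpoint can be called from a script rather than curl. This is a minimal sketch using only the standard library; the endpoint path is taken from the curl command above, but the shape of the JSON response is not documented here, so no field names are assumed.

```python
# Sketch: querying the quality-score API for an MCP server.
# Only the URL pattern is taken from the page above; the response
# schema is an unknown, so the fetch simply decodes whatever JSON
# the endpoint returns.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality report (requires network access)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Print the URL for this repository; call fetch_quality() to
    # retrieve the actual report when network access is available.
    print(quality_url("run-llama", "mcp-server-llamacloud"))
```

Unauthenticated calls are rate-limited to 100 requests per day, as noted above, so cache results rather than re-fetching per lookup.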