yoanbernabeu/S3-Documentation-MCP-Server

A lightweight Model Context Protocol (MCP) server that brings RAG (Retrieval-Augmented Generation) capabilities to your LLM over Markdown documentation stored on S3.

Quality score: 34 / 100 (Emerging)

This tool helps developers, product managers, and content creators make their Markdown documentation, internal wikis, or API docs searchable and summarizable with AI. It takes existing Markdown files stored in S3-compatible cloud storage and turns them into an intelligent knowledge base, exposing AI-powered search directly inside popular large language model (LLM) clients such as Cursor and Claude Desktop so users can get answers and find information quickly.

Use this if you need to provide AI-powered search and question-answering over your organization's documentation, product manuals, or internal knowledge base, especially if those documents are stored as Markdown files in S3.

Not ideal if your documentation is not in Markdown, is not stored in S3, or if you need a production-ready system with guaranteed backward compatibility and advanced security features today: the project is still a work in progress.
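As a rough sketch of how a server like this is wired into an MCP client: Claude Desktop reads server definitions from its JSON configuration under an `mcpServers` key. The command, arguments, and environment variable names below are illustrative assumptions, not this project's documented settings; check the repository README for the actual entry point and configuration.

```json
{
  "mcpServers": {
    "s3-docs": {
      "command": "node",
      "args": ["dist/index.js"],
      "env": {
        "S3_ENDPOINT": "https://s3.example.com",
        "S3_BUCKET": "my-docs-bucket",
        "S3_ACCESS_KEY_ID": "YOUR_ACCESS_KEY",
        "S3_SECRET_ACCESS_KEY": "YOUR_SECRET_KEY"
      }
    }
  }
}
```

Once registered this way, the client launches the server process and the documentation-search tools it exposes become available in the chat interface.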

Tags: product-documentation, internal-wiki, api-documentation, knowledge-management, developer-support
No package · No dependents
Maintenance: 6 / 25
Adoption: 4 / 25
Maturity: 15 / 25
Community: 9 / 25


Stars: 7
Forks: 1
Language: TypeScript
License: MIT
Last pushed: Oct 31, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/yoanbernabeu/S3-Documentation-MCP-Server"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.