NEDDL/fastify-mcp-server

🚀 High-performance MCP (Model Context Protocol) server built with Fastify, TypeScript, and functional programming. Production-ready with authentication, metrics, and auto-discovery capabilities for AI agents and LLM applications.

Score: 30 / 100 (Emerging)

This project helps AI application developers create robust, scalable backend services for their AI agents and large language models (LLMs). It takes definitions of tools, resources, and prompts, and securely exposes them through a high-performance Model Context Protocol (MCP) server. It is aimed at developers building AI platforms, LLM integrations, or enterprise AI solutions.

No commits in the last 6 months.

Use this if you are a developer building production-ready AI applications and need a fast, secure, and standardized way to connect your AI agents or LLMs to external capabilities.

Not ideal if you are looking for a pre-built AI application or a client-side library to consume existing MCP services, as this project is focused on the server-side implementation.

Tags: AI development · LLM integration · backend services · microservices · API development
Badges: Stale (6m) · No Package · No Dependents

Maintenance: 2 / 25
Adoption: 4 / 25
Maturity: 15 / 25
Community: 9 / 25


Stars: 7
Forks: 1
Language: TypeScript
License: MIT
Last pushed: Oct 02, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/NEDDL/fastify-mcp-server"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
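For callers who prefer not to shell out to curl, the same endpoint can be queried from TypeScript. This is a minimal sketch: the base URL is taken from the curl example above, the `qualityUrl` helper name is hypothetical, and the shape of the JSON response is not documented here, so it is logged as-is.

```typescript
// Base of the quality-score API, taken from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp";

// Hypothetical helper: builds the quality-score URL for a given repo.
function qualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetches the score data (requires Node 18+ for the built-in fetch).
// The response schema is undocumented here, so the raw JSON is printed.
async function fetchQuality(owner: string, repo: string): Promise<void> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  console.log(await res.json());
}

// Example: fetchQuality("NEDDL", "fastify-mcp-server");
```

Within the free tier (100 requests/day without a key), no authentication header is needed.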