local-skills-mcp and skill-mcp
These are ecosystem siblings: the "local-skills-mcp" server (A) lets agents use skills stored on the local filesystem via the MCP protocol, while the "skill-mcp" platform (B) lets LLMs programmatically manage and execute skills through any MCP-compatible client.
About local-skills-mcp
kdpa-llc/local-skills-mcp
Universal MCP server enabling any LLM or AI agent to utilize expert skills from your local filesystem. Reduces context consumption through lazy loading. Works with Claude, Cline, and any MCP-compatible client.
This project helps AI developers and power users extend the capabilities of their AI agents and Large Language Models (LLMs). It lets you organize and store specialized instructions, or "skills," for your AI on your local computer. Your agents can then load these skills on demand, performing complex, domain-specific tasks without consuming excessive context.
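The on-demand pattern described above can be sketched as a lazy loader: only skill names are surfaced up front, and a skill's full instructions are read from disk when the agent actually requests them. This is a minimal, hypothetical sketch; the actual directory layout and API of local-skills-mcp may differ, and the `SKILL.md` convention here is an assumption.

```python
from pathlib import Path


class SkillLoader:
    """Lazily load skills from a local directory (illustrative sketch only).

    Assumes each skill is a folder containing a SKILL.md with its
    instructions -- a hypothetical layout, not necessarily the project's.
    """

    def __init__(self, root: str):
        self.root = Path(root)

    def list_skills(self) -> list[str]:
        # Only names are sent to the model up front, keeping context small.
        return sorted(
            p.name for p in self.root.iterdir() if (p / "SKILL.md").is_file()
        )

    def load_skill(self, name: str) -> str:
        # Full instructions are read only when the agent asks for this skill.
        return (self.root / name / "SKILL.md").read_text()
```

The key design point is that `list_skills` costs a few tokens per skill, while the potentially large instruction text in `load_skill` enters the context only when needed.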
About skill-mcp
fkesheh/skill-mcp
LLM-managed skills platform using MCP - create, edit, and execute skills programmatically in Claude, Cursor, and any MCP-compatible client without manual file uploads.
This system helps AI agents like Claude manage and execute specialized functions, or "skills," automatically. Instead of requiring manual uploads of code or files, the AI can create, edit, and run these skills on the fly. Driven by natural language commands, it can even produce unified code that combines multiple skills, acting as an extension of the AI's capabilities for complex tasks.
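The create/edit/execute cycle described above can be illustrated with a small in-memory registry. This is a hypothetical sketch under assumed names (`create_skill`, `edit_skill`, `execute_skill`, and a `run(**kwargs)` entry point), not skill-mcp's actual API.

```python
class SkillRegistry:
    """Illustrative sketch of an LLM-managed skill lifecycle.

    All method names and the run(**kwargs) convention are assumptions
    for illustration, not skill-mcp's real interface.
    """

    def __init__(self):
        self._skills: dict[str, str] = {}  # skill name -> Python source

    def create_skill(self, name: str, source: str) -> None:
        self._skills[name] = source

    def edit_skill(self, name: str, source: str) -> None:
        if name not in self._skills:
            raise KeyError(f"unknown skill: {name}")
        self._skills[name] = source

    def execute_skill(self, name: str, **kwargs):
        # Each skill's source must define a run(**kwargs) entry point.
        namespace: dict = {}
        exec(self._skills[name], namespace)
        return namespace["run"](**kwargs)
```

In a real deployment the registry would persist skills and sandbox execution; the sketch only shows the round trip from natural-language-driven creation to programmatic invocation.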