langchain-mcp-adapters and langchain-mcp-client
The LangChain adapter provides the core integration layer that lets LLMs call tools exposed by MCP servers, while the Streamlit application builds a UI client on top of that integration for connecting to and interacting with those servers. The two projects are complements that work in sequence, not alternatives.
About langchain-mcp-adapters
langchain-ai/langchain-mcp-adapters
LangChain 🔌 MCP
This project helps AI developers integrate external capabilities, such as custom calculators or data lookups, into their LangChain or LangGraph agents. It takes existing Model Context Protocol (MCP) tools, which are essentially specialized functions exposed by MCP servers, and converts them into tools the agent can call. The result is an agent that can perform a wider range of tasks by using these external tools.
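The adapter idea described above, exposing an external function to an agent as a named, described tool it can discover and invoke, can be sketched in plain Python. This is a minimal illustrative stand-in, not the library's actual API; the `Tool` class and the registry here are hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    """Hypothetical minimal tool wrapper: an external function paired
    with a name and description so an agent can look it up and call it,
    analogous to how MCP tools are surfaced to a LangChain/LangGraph agent."""
    name: str
    description: str
    func: Callable[..., object]

    def invoke(self, **kwargs) -> object:
        # Delegate the call to the wrapped external function.
        return self.func(**kwargs)


def add(a: float, b: float) -> float:
    """A custom calculator function, like the example in the text."""
    return a + b


# The agent receives a registry of available tools and selects one by name.
registry = {t.name: t for t in [Tool("add", "Add two numbers", add)]}

result = registry["add"].invoke(a=2, b=3)
print(result)  # prints 5
```

The real adapter does the same job at a protocol level: it discovers the tools an MCP server advertises and wraps each one so the agent framework can treat it like any other tool.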
About langchain-mcp-client
guinacio/langchain-mcp-client
This Streamlit application provides a user interface for connecting to MCP (Model Context Protocol) servers and interacting with them using different LLM providers (OpenAI, Anthropic, Google, Ollama).
This tool lets you interact with various AI language models (such as those from OpenAI, Anthropic, or Google) and with specialized MCP tools through a simple chat interface. You can enter text, attach documents or images, and receive streaming responses. It is designed for anyone who regularly uses large language models for tasks ranging from brainstorming to detailed analysis and needs to manage multiple conversations and tools efficiently.
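A client like this needs connection details for each MCP server it talks to. As a hypothetical example only (the exact schema and field names depend on the application), a stdio-based server entry might look like:

```json
{
  "math-server": {
    "command": "python",
    "args": ["math_server.py"],
    "transport": "stdio"
  }
}
```

Servers using a network transport would instead supply a URL rather than a local command to launch.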