langchain-mcp-adapters and langchain-mcp-client

The LangChain adapter provides the core integration layer that lets LLMs access MCP server tools, while the Streamlit application builds a UI client on top of that integration to demonstrate and interact with connected servers. They are complements that work in sequence rather than alternatives.

langchain-mcp-client: 52 (Established)

                 langchain-mcp-adapters      langchain-mcp-client
Maintenance      20/25                       10/25
Adoption         15/25                       8/25
Maturity         25/25                       16/25
Community        21/25                       18/25
Stars            3,411                       47
Forks            379                         15
Downloads        —                           —
Commits (30d)    21                          0
Language         Python                      Python
License          MIT                         MIT
Risk flags       None                        No Package, No Dependents

About langchain-mcp-adapters

langchain-ai/langchain-mcp-adapters

LangChain 🔌 MCP

This project helps AI developers integrate external capabilities, like custom calculators or data lookups, into their LangChain or LangGraph AI agents. It takes existing 'Model Context Protocol' (MCP) tools, which are essentially specialized functions, and makes them accessible to the AI agent. The result is an AI agent that can perform a wider range of tasks by using these external tools.
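The conversion the project performs can be sketched in plain Python. This is a self-contained stand-in, not the package's real API: the `MCPTool`, `LangChainStyleTool`, and `adapt` names are hypothetical, and the real adapter talks to a live MCP server rather than a local callable.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class MCPTool:
    """A tool as an MCP server would advertise it (hypothetical shape)."""
    name: str
    description: str
    input_schema: dict           # JSON Schema describing the arguments
    call: Callable[[dict], Any]  # invokes the tool (here, a local stand-in)

@dataclass
class LangChainStyleTool:
    """The rough shape a LangChain/LangGraph agent expects (hypothetical)."""
    name: str
    description: str
    args_schema: dict
    func: Callable[..., Any]

def adapt(mcp_tool: MCPTool) -> LangChainStyleTool:
    """Wrap an MCP tool so an agent can call it with keyword arguments."""
    def func(**kwargs: Any) -> Any:
        # Forward the keyword arguments as the single payload dict MCP uses.
        return mcp_tool.call(kwargs)
    return LangChainStyleTool(
        name=mcp_tool.name,
        description=mcp_tool.description,
        args_schema=mcp_tool.input_schema,
        func=func,
    )

# Example: an MCP "calculator" tool becomes an agent-callable function.
add = MCPTool(
    name="add",
    description="Add two integers.",
    input_schema={"type": "object",
                  "properties": {"a": {"type": "integer"},
                                 "b": {"type": "integer"}}},
    call=lambda args: args["a"] + args["b"],
)
tool = adapt(add)
print(tool.func(a=2, b=3))  # → 5
```

The point of the sketch is the direction of the adaptation: the tool's name, description, and argument schema carry over unchanged, and only the calling convention is translated for the agent.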

AI development, LLM integration, tool orchestration, agent workflow, external service connection

About langchain-mcp-client

guinacio/langchain-mcp-client

This Streamlit application provides a user interface for connecting to MCP (Model Context Protocol) servers and interacting with them using different LLM providers (OpenAI, Anthropic, Google, Ollama).

This tool helps you interact with various AI language models (like those from OpenAI, Anthropic, or Google) and specialized AI tools through a simple chat interface. You can input text, attach documents or images, and receive streaming responses. It's designed for anyone who regularly uses large language models for tasks ranging from brainstorming to detailed analysis and needs to manage multiple conversations and tools efficiently.
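The provider selection the app offers can be sketched as a registry mapping provider names to model constructors. Everything here is a hypothetical stand-in: the real application instantiates each provider's chat model (OpenAI, Anthropic, Google, Ollama) through its SDK, while this sketch only models the dispatch.

```python
from typing import Callable, Dict

def make_echo_model(provider: str) -> Callable[[str], str]:
    """Stand-in for a provider's chat model: tags replies with its name."""
    return lambda prompt: f"[{provider}] {prompt}"

# Registry of lazily constructed models, one entry per supported provider.
PROVIDERS: Dict[str, Callable[[], Callable[[str], str]]] = {
    "openai":    lambda: make_echo_model("openai"),
    "anthropic": lambda: make_echo_model("anthropic"),
    "google":    lambda: make_echo_model("google"),
    "ollama":    lambda: make_echo_model("ollama"),
}

def chat(provider: str, prompt: str) -> str:
    """Look up the selected provider and send the prompt to its model."""
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    model = PROVIDERS[provider]()  # construct the model on first use
    return model(prompt)

print(chat("anthropic", "hello"))  # → [anthropic] hello
```

Keeping construction behind the registry means switching providers in the UI only changes a dictionary lookup, not the chat loop itself.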

AI-chat-management document-analysis multimodal-AI knowledge-retrieval AI-assistant

Scores updated daily from GitHub, PyPI, and npm data.