guinacio/langchain-mcp-client

This Streamlit application provides a user interface for connecting to MCP (Model Context Protocol) servers and interacting with them using different LLM providers (OpenAI, Anthropic, Google, Ollama).

Score: 52 / 100 (Established)

This tool helps you interact with various AI language models (like those from OpenAI, Anthropic, or Google) and specialized AI tools through a simple chat interface. You can input text, attach documents or images, and receive streaming responses. It's designed for anyone who regularly uses large language models for tasks ranging from brainstorming to detailed analysis and needs to manage multiple conversations and tools efficiently.

Use this if you need a flexible, unified interface to chat with different AI models, attach various file types for context, and access specialized AI tools without switching between multiple platforms.

Not ideal if you only need to use a single AI model for basic text generation and do not require advanced features like file attachments, tool integration, or persistent conversation history.

Tags: AI-chat-management, document-analysis, multimodal-AI, knowledge-retrieval, AI-assistant

No package · No dependents
Maintenance: 10 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 18 / 25


Stars: 47
Forks: 15
Language: Python
License: MIT
Last pushed: Feb 15, 2026
Commits (30d): 0

Get this data via API:

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/guinacio/langchain-mcp-client"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
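The same endpoint can be queried programmatically. A minimal Python sketch, assuming the endpoint returns JSON (the response schema is not documented here, so field names are not shown):

```python
import json
import urllib.request

# Public quality endpoint for this repo (no API key needed, 100 requests/day).
BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"
repo = "guinacio/langchain-mcp-client"
url = f"{BASE}/{repo}"

def fetch_quality(endpoint: str) -> dict:
    """Fetch the quality-score JSON for a repository from the public API."""
    with urllib.request.urlopen(endpoint, timeout=10) as resp:
        return json.load(resp)

# Usage: data = fetch_quality(url)
```

This mirrors the curl command above; swap in an API key (per the service's docs) if you need the higher rate limit.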