pors/langchain-chat-websockets

LangChain LLM chat with streaming response over websockets

Score: 37 / 100 (Emerging)

This project provides a ready-to-use framework for building conversational AI applications that can deliver responses in real-time. It takes user queries as input and streams back AI-generated text, making interactions feel more dynamic and immediate. Developers and IT professionals can use this to integrate advanced chat functionalities into their web applications.

No commits in the last 6 months.

Use this if you need to quickly set up a chat interface that uses large language models and streams responses to users as they are generated, rather than waiting for a full reply.

Not ideal if you are looking for a pre-built chat application with a user interface, as this project focuses on the backend infrastructure for streaming LLM responses.
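The core idea the project implements — pushing LLM tokens to the client as they are generated instead of buffering the full reply — can be sketched in plain Python. This is an illustrative pattern, not the project's actual code: `fake_llm_stream`, `handle_query`, and `send` are hypothetical names, and in the real project the token source would be LangChain's streaming output and `send` would be a websocket write.

```python
import asyncio

async def fake_llm_stream(prompt: str):
    # Hypothetical stand-in for a streaming LLM; the real project
    # streams tokens from LangChain rather than a fixed list.
    for token in ["Hello", ",", " world", "!"]:
        await asyncio.sleep(0)  # yield control, as a network-bound LLM would
        yield token

async def handle_query(prompt: str, send):
    # Forward each token to the client as soon as it is produced,
    # instead of waiting for the full completion.
    async for token in fake_llm_stream(prompt):
        await send(token)
    await send("[DONE]")  # end-of-reply sentinel for the client

received = []

async def send(chunk: str):
    # Stands in for something like websocket.send_text(chunk) on a real server.
    received.append(chunk)

asyncio.run(handle_query("hi", send))
print("".join(received[:-1]))  # prints: Hello, world!
```

The `[DONE]` sentinel is one common convention for telling the client the reply has ended; a real server might instead close the socket or send a structured end-of-stream message.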

Tags: AI-powered chat, web application development, real-time communication, backend infrastructure, language model integration
Signals: Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 12 / 25


Stars: 97
Forks: 9
Language: HTML
License: Apache-2.0
Last pushed: Nov 30, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/pors/langchain-chat-websockets"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
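The same endpoint can be called from Python with only the standard library. This is a minimal sketch: the URL structure is taken from the curl example above, but the shape of the JSON returned is not documented here, so the parsed response should be treated as opaque.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(repo: str) -> str:
    # Build the per-repo endpoint path, e.g. for
    # "llm-tools/pors/langchain-chat-websockets".
    return f"{BASE}/{repo}"

def fetch_quality(repo: str) -> dict:
    # No API key needed up to 100 requests/day; the JSON fields
    # are not specified here, so the dict is returned as-is.
    with urllib.request.urlopen(quality_url(repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("llm-tools/pors/langchain-chat-websockets"))
```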