foldl/chatllm.cpp

Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU)

Score: 59 / 100 — Established

This project helps you run powerful conversational AI models directly on your own computer, even without a high-end setup. You can feed in text, images, or audio and get back real-time chat responses, summaries, or even generate new content. It's designed for individuals who want to use large language models for personal tasks, research, or creative writing with full control over their data.

831 stars. Actively maintained with 11 commits in the last 30 days.

Use this if you want to run various AI chat models privately on your desktop or laptop, customize their behavior, and experiment with different models for tasks like writing assistance, coding help, or general knowledge retrieval.

Not ideal if you're looking for a simple, cloud-based AI chat service that doesn't require any local setup or technical configuration.

Tags: personal-ai, local-llm, creative-writing, research-assistant, knowledge-retrieval
No package · No dependents
Maintenance 17 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 16 / 25


Stars: 831
Forks: 62
Language: C++
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 11

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/foldl/chatllm.cpp"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
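For programmatic use, the curl call above can be wrapped in a small client. A minimal Python sketch, assuming only the endpoint shape shown on this page (the response schema and any field names are not documented here, so the result is treated as an opaque JSON object; `quality_url` and `fetch_quality` are hypothetical helper names):

```python
import json
import urllib.request

# Base endpoint taken from the curl example on this page.
BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL, e.g.
    .../quality/llm-tools/foldl/chatllm.cpp"""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality data as parsed JSON.

    Anonymous access is rate-limited to 100 requests/day;
    a free key raises that to 1,000/day (how the key is passed
    is not documented on this page).
    """
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)
```

Example: `fetch_quality("llm-tools", "foldl", "chatllm.cpp")` issues the same request as the curl command above and returns the decoded JSON.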