li-plus/chatglm.cpp

C++ implementation of ChatGLM-6B, ChatGLM2-6B, ChatGLM3, and GLM4(V)

Score: 47 / 100 (Emerging)

This project lets you run ChatGLM language models directly on your own computer, even a MacBook, for real-time conversations. You provide text prompts or images, and the model generates human-like responses, writes code, or calls functions. It's designed for individuals who want to use large language models locally for tasks like drafting content, getting quick answers, or generating code.

2,960 stars. No commits in the last 6 months.

Use this if you want to run advanced conversational AI models on your own machine for privacy, speed, or offline access, leveraging your computer's CPU or GPU.

Not ideal if you prefer cloud-based AI services or don't have the technical comfort to set up and convert models locally.
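The local setup the blurb alludes to follows the usual llama.cpp-style flow: build the C++ binary, convert a Hugging Face checkpoint into a quantized GGML file, then chat against it. A minimal sketch; the script name, the `-i`/`-t`/`-o` flags, and the `q4_0` quantization type reflect the repo's documented usage but may differ in the current README:

```shell
# Build the inference binary (CMake-based; assumes a C++ toolchain is installed).
git clone --recursive https://github.com/li-plus/chatglm.cpp.git && cd chatglm.cpp
cmake -B build && cmake --build build -j --config Release

# Convert a Hugging Face checkpoint to a 4-bit quantized GGML file.
python3 chatglm_cpp/convert.py -i THUDM/chatglm3-6b -t q4_0 -o chatglm3-ggml.bin

# Run a single prompt against the quantized model.
./build/bin/main -m chatglm3-ggml.bin -p "Hello"
```

The conversion step downloads the full-precision weights, so the first run needs network access and several GB of disk; afterwards, inference is fully offline.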

Tags: local-ai, personal-chatbot, content-generation, code-generation, natural-language-processing
Badges: Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 21 / 25


Stars: 2,960
Forks: 329
Language: C++
License: MIT
Last pushed: Jul 31, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/li-plus/chatglm.cpp"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
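The endpoint returns JSON, so fields can be pulled out with `jq`. A sketch using a canned sample in place of a live response; the field names here are illustrative assumptions, not confirmed by the API:

```shell
# Live call, shown for reference (network required):
#   curl -s "https://pt-edge.onrender.com/api/v1/quality/llm-tools/li-plus/chatglm.cpp"
# A canned sample stands in for the response below; field names are assumptions.
response='{"project":"li-plus/chatglm.cpp","score":47,"stars":2960,"forks":329}'
printf '%s' "$response" | jq -r '.score'   # prints 47
printf '%s' "$response" | jq -r '.stars'   # prints 2960
```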