hwpoison/llamacpp-terminal-chat

A lightweight terminal chat interface for the llama.cpp server, written in C++, with Windows and Linux support and many features.

Quality score: 30 / 100 (Emerging)

This tool provides a terminal-based chat interface for interacting with a local language model server (specifically, `llama.cpp` server). You can input text prompts and receive AI-generated responses directly in your command line. It's designed for anyone who wants to test, experiment with, or simply chat with a large language model running on their own machine, without needing a web browser or complex programming.
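Under the hood, a client like this talks to the llama.cpp server's HTTP API. As a point of reference, you can query a locally running server directly; this is a sketch assuming the default port 8080, and the model path is purely illustrative:

```shell
# Assumes a llama.cpp server is already running locally, e.g.:
#   ./llama-server -m ./models/your-model.gguf --port 8080
# Send a prompt to its /completion endpoint (payload values are illustrative):
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello, how are you?", "n_predict": 32}'
```

The server responds with JSON containing the generated text; a terminal chat client wraps this request/response loop in an interactive prompt.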

No commits in the last 6 months.

Use this if you want a flexible, command-line interface to interact with your local `llama.cpp` server for prototyping, testing, or casual conversation.

Not ideal if you need a graphical user interface (GUI) or are looking for a cloud-based language model service.

Tags: AI-chat, local-LLM-interaction, prompt-engineering, language-model-testing, offline-AI-chat
Flags: No License, Stale (6 months), No Package, No Dependents
Maintenance: 2 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 13 / 25


Stars: 25
Forks: 4
Language: C++
License: none
Last pushed: May 31, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/hwpoison/llamacpp-terminal-chat"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.