ShelbyJenkins/llm_client

The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from Probabilistic LLM Vibes

45 / 100 (Emerging)

This project provides a robust, easy-to-use Rust interface to `llama.cpp`'s `llama-server`. It simplifies managing the `llama.cpp` toolchain and interacting with local large language models (LLMs) for text generation, text infilling, and text embeddings. It is aimed at developers who need to integrate local LLM capabilities into their applications.
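
As a rough illustration of the workflow the crate wraps, here is a minimal sketch, assuming a llama-server instance is already running locally on the default port 8080. It does not use llm_client's own API; it simply calls llama-server's /completion endpoint over HTTP with reqwest and serde_json, and the prompt, port, and dependency versions shown are assumptions, not part of this project.

// Minimal sketch (not the llm_client API): request a completion from a
// llama-server instance assumed to be running at http://127.0.0.1:8080.
// Assumed Cargo.toml dependencies:
//   reqwest = { version = "0.12", features = ["blocking", "json"] }
//   serde_json = "1"
use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // llama-server's /completion endpoint takes a prompt and a token budget.
    let request = json!({
        "prompt": "Write one sentence about Rust:",
        "n_predict": 64
    });

    let response: Value = client
        .post("http://127.0.0.1:8080/completion")
        .json(&request)
        .send()?
        .json()?;

    // The generated text comes back in the "content" field.
    println!("{}", response["content"].as_str().unwrap_or(""));
    Ok(())
}

What llm_client adds over a raw call like this is the typed request surface plus the toolchain and server management described above.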

245 stars and 10 monthly downloads. No commits in the last 6 months.

Use this if you are a Rust developer looking for a straightforward, fully typed way to integrate and manage local `llama.cpp` servers in your applications on Linux, macOS, or Windows.

Not ideal if you are looking for a high-level agent-building framework or for a tool that talks to cloud-based LLMs rather than local ones.

Tags: Rust development, local AI, LLM integration, application development, machine learning engineering
Flags: Stale (6 months), no package, no dependents

Score breakdown:
Maintenance: 2 / 25
Adoption: 12 / 25
Maturity: 16 / 25
Community: 15 / 25

Stars: 245
Forks: 25
Language: Rust
License: MIT
Last pushed: Aug 06, 2025
Monthly downloads: 10
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ShelbyJenkins/llm_client"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.