RhinoDevel/mt_llm

Pure C wrapper library for using llama.cpp on Linux and Windows as simply as possible.

Quality score: 36 / 100 (Emerging)

This is a C library that simplifies integrating large language models (LLMs) into single-user applications on Linux and Windows. It allows developers to feed text prompts into an LLM and receive generated text, embeddings, or ranking scores as output. This tool is for C/C++ developers who want to embed local LLM capabilities directly into their software without dealing with complex configurations.

Use this if you are a C or C++ developer building a desktop application and want to add simple, local LLM inference capabilities, like text generation or embeddings, without extensive setup.

Not ideal if you need a solution for web-based applications or multi-user deployments, or if you prefer to work in a higher-level programming language like Python.

Tags: Application Development, Local AI, Desktop Software, C/C++ Programming, Language Model Integration
No package; no dependents
Maintenance: 10 / 25
Adoption: 5 / 25
Maturity: 15 / 25
Community: 6 / 25

How are scores calculated?

Stars: 14
Forks: 1
Language: C++
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/RhinoDevel/mt_llm"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.