jmont-dev/ollama-hpp

Modern, Header-only C++ bindings for the Ollama API.

Quality score: 48 / 100 (Emerging)

A header-only C++ library for integrating local large language models (LLMs) into C++ applications. It takes a model name and a user prompt as input and returns generated text or chat responses from the LLM. It is aimed at developers who want to add AI capabilities backed by local models to their software.


Use this if you are a C++ developer building an application and want to call local, Ollama-hosted large language models directly from your C++ code.

Not ideal if you are not a C++ developer, or if you prefer to use cloud-based LLM APIs rather than running models locally.
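To make the integration concrete, here is a minimal sketch. The single-header include and the `ollama::generate(model, prompt)` call are assumptions about the library's API based on the description above, not verified signatures; it also assumes an Ollama server running locally on its default port with the named model already pulled.

```cpp
// Hypothetical usage sketch for ollama-hpp (header-only).
// Assumes: "ollama.hpp" on the include path, an Ollama server on
// localhost:11434, and a locally pulled model named "llama3".
#include <iostream>
#include <string>

#include "ollama.hpp" // header-only: drop the file into your include path

int main() {
    // The two inputs the library takes: a model name and a user prompt.
    std::string model = "llama3";
    std::string prompt = "Why is the sky blue?";

    // Ask the local model for a completion and print the generated text.
    // ollama::generate and the response's stream output are assumed here.
    ollama::response response = ollama::generate(model, prompt);
    std::cout << response << std::endl;
    return 0;
}
```

Because the library is header-only, no linking step is needed; compiling with a C++ toolchain and the header present should suffice.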

Tags: C++ development, Local AI integration, Application development, LLM integration
No package published. No known dependents.

Maintenance: 6 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 213
Forks: 27
Language: C++
License: MIT
Last pushed: Oct 20, 2025
Commits (30d): 0

Get this data via API:

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/jmont-dev/ollama-hpp"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.