yoshoku/llama_cpp.rb

llama_cpp.rb provides Ruby bindings for llama.cpp

Quality score: 49 / 100 (Emerging)

This project lets Ruby developers integrate powerful, locally run large language models directly into their applications. It loads a pre-trained, quantized model file (such as Open LLaMA) and lets your Ruby code send prompts and receive generated text, which makes it ideal for adding local AI capabilities to an application without relying on external APIs.
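To give a sense of what that integration looks like, here is a minimal sketch of loading a model and generating text with the gem. The model path is a placeholder, and the class and parameter names (ModelParams, ContextParams, n_predict) follow the gem's documented usage pattern but have changed between versions, so treat this as an illustration under those assumptions rather than copy-paste code.

```ruby
require 'llama_cpp'

# Path to a quantized GGUF model file on disk (placeholder; supply your own).
MODEL_PATH = '/path/to/open-llama.gguf'

# Load the model; ModelParams controls options such as GPU layer offloading.
model_params = LLaMACpp::ModelParams.new
model = LLaMACpp::Model.new(model_path: MODEL_PATH, params: model_params)

# Create an inference context over the loaded model.
context_params = LLaMACpp::ContextParams.new
context = LLaMACpp::Context.new(model: model, params: context_params)

# Send a prompt and print the generated continuation (up to 32 tokens).
puts LLaMACpp.generate(context, 'Hello, World.', n_predict: 32)
```

Because the gem is a thin binding over llama.cpp, most tuning (context size, sampling, threads) is exposed through these params objects rather than a high-level DSL.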


Use this if you are building Ruby applications and need to embed high-performance, local large language model inference directly in your codebase.

Not ideal if you are looking for a high-level, opinionated Ruby wrapper for LLMs or do not need to integrate LLMs directly into a Ruby application.

Tags: Ruby development, local AI, LLM integration, application development, natural language processing
No package. No dependents.
Maintenance 10 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 232
Forks: 18
Language: C
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yoshoku/llama_cpp.rb"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.