yoshoku/llama_cpp.rb
llama_cpp.rb provides Ruby bindings for llama.cpp.
This project lets Ruby developers run large language models locally, directly from their Ruby applications. It loads a pre-trained, quantized model file (such as Open LLaMA) and lets your Ruby code send prompts and receive generated text, making it a good fit for adding local AI capabilities to an application without relying on external APIs.
Use this if you are a Ruby developer who needs to embed high-performance, local LLM inference directly in a Ruby codebase.
Not ideal if you want a high-level, opinionated Ruby wrapper for LLMs, or if you do not need to run models inside a Ruby application at all.
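A minimal sketch of the load-prompt-generate flow described above, based on the gem's documented API. The class and method names (`LLaMACpp::Model`, `LLaMACpp::Context`, `LLaMACpp.generate`) and keyword arguments may differ between gem versions, and the model path is a placeholder; treat this as an illustration, not a definitive usage guide.

```ruby
require 'llama_cpp'

# Load a quantized model from disk (placeholder path; supply your own GGUF file).
model_params = LLaMACpp::ModelParams.new
model = LLaMACpp::Model.new(model_path: '/path/to/open_llama.gguf', params: model_params)

# Create an inference context over the loaded model.
context_params = LLaMACpp::ContextParams.new
context = LLaMACpp::Context.new(model: model, params: context_params)

# Send a prompt and print the generated continuation.
puts LLaMACpp.generate(context, 'Hello, World.')
```

Running this requires installing the gem (`gem install llama_cpp`) and downloading a quantized model file separately; no network access is needed at inference time.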
Stars: 232
Forks: 18
Language: C
License: MIT
Category:
Last pushed: Mar 11, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yoshoku/llama_cpp.rb"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
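The curl call above can also be made from Ruby with the standard library. This sketch uses a helper of our own naming (`quality_uri`) to build the endpoint URL from the source; the shape of the JSON response is not documented here, so the example only parses it generically.

```ruby
require 'net/http'
require 'json'
require 'uri'

# Build the API endpoint URI for a given repository (helper name is ours).
def quality_uri(owner, name)
  URI("https://pt-edge.onrender.com/api/v1/quality/transformers/#{owner}/#{name}")
end

# Fetch and parse the quality data; returns nil on a non-2xx response.
def fetch_quality(owner, name)
  res = Net::HTTP.get_response(quality_uri(owner, name))
  JSON.parse(res.body) if res.is_a?(Net::HTTPSuccess)
end

# Uncomment to hit the live API (counts against the 100 requests/day limit):
# p fetch_quality('yoshoku', 'llama_cpp.rb')
```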
Higher-rated alternatives
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.