mdegans/drama_llama
Yet another `llama.cpp` Rust wrapper
This project offers a Rust-based interface for interacting with `llama.cpp` to perform tasks with local language models. It takes language model files as input and can generate conversational responses or identify memorized content within the models. This tool is designed for developers who want to integrate or experiment with local LLMs in their Rust applications.
No commits in the last 6 months.
Use this if you are a Rust developer who wants flexible, fine-grained control over local large language models through `llama.cpp`.
Not ideal if you are an end-user looking for a ready-to-use application; this is a developer library and is not yet intended for production use.
Stars: 12
Forks: 3
Language: Rust
License: —
Category: —
Last pushed: Jun 19, 2024
Monthly downloads: 56
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mdegans/drama_llama"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.