rodneylab/local-ai-llm-playground
Experiments running offline LLMs locally in Python and Rust, using Ollama and llama.cpp
Overall score: 22 / 100 (Experimental · No Package · No Dependents)

Maintenance: 6 / 25
Adoption: 1 / 25
Maturity: 15 / 25
Community: 0 / 25

Stars: 1
Forks: —
Language: Rust
License: BSD-3-Clause
Category: —
Last pushed: Dec 01, 2025
Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/rodneylab/local-ai-llm-playground"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
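The same endpoint can be called from a script. A minimal Python sketch follows; only the URL pattern comes from the curl example above, while the function names are ours and the assumption that the endpoint returns a JSON body is unverified.

```python
"""Sketch of calling the pt-edge quality API from Python.

The URL pattern is taken from the curl example on this page; the helper
names and the JSON-response assumption are ours, not documented API facts.
"""
import json
import urllib.request


def quality_url(registry: str, repo: str) -> str:
    """Build the quality endpoint URL for a repository (hypothetical helper)."""
    return f"https://pt-edge.onrender.com/api/v1/quality/{registry}/{repo}"


def fetch_quality(registry: str, repo: str) -> dict:
    """Fetch the quality report; assumes the endpoint responds with JSON."""
    with urllib.request.urlopen(quality_url(registry, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Stays within the anonymous limit of 100 requests/day; no API key needed.
    print(quality_url("transformers", "rodneylab/local-ai-llm-playground"))
    # → https://pt-edge.onrender.com/api/v1/quality/transformers/rodneylab/local-ai-llm-playground
```

Requests made without a key count against the 100/day anonymous quota, so it may be worth caching the response locally rather than refetching on every run.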