mariochavez/llm_server
Rack API application for Llama.cpp
This project helps Ruby developers integrate large language models (LLMs) into their applications. It provides a straightforward API endpoint where your Ruby application can send text as input and receive text completions generated by an LLM as output. Developers creating Ruby-based tools that need AI text generation will find this useful.
No commits in the last 6 months.
Use this if you are a Ruby developer building an application and want to incorporate local LLM text generation capabilities via an easy-to-use API.
Not ideal if you are not a Ruby developer or you need advanced features beyond basic text completion, such as fine-tuning models or complex multi-turn conversations.
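To illustrate the integration, here is a minimal sketch of how a Ruby application might call such a server. The endpoint path `/v1/complete`, the port, and the JSON field names (`prompt`, `n_predict`, `content`) are assumptions for illustration, not the project's documented API; check the repository's README for the actual interface.

```ruby
require "json"
require "net/http"
require "uri"

# Hypothetical endpoint for a locally running llm_server instance.
LLM_SERVER_URI = URI("http://localhost:9292/v1/complete")

# Build a JSON completion request. The :prompt and :n_predict
# field names are assumed, not taken from the project's docs.
def build_completion_request(prompt, max_tokens: 128)
  req = Net::HTTP::Post.new(LLM_SERVER_URI)
  req["Content-Type"] = "application/json"
  req.body = JSON.generate(prompt: prompt, n_predict: max_tokens)
  req
end

# Sending the request requires a running server, e.g.:
#   req = build_completion_request("Write a haiku about Ruby")
#   res = Net::HTTP.start(LLM_SERVER_URI.host, LLM_SERVER_URI.port) do |http|
#     http.request(req)
#   end
#   puts JSON.parse(res.body)["content"]
```

The client side is plain `Net::HTTP` from Ruby's standard library, so no extra gems are needed to consume the API.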
Stars: 39
Forks: 1
Language: Ruby
License: MIT
Category:
Last pushed: Jul 18, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/mariochavez/llm_server"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
microsoft/multilspy
multilspy is an LSP client library in Python intended to be used to build applications around...
mlc-ai/xgrammar
Fast, Flexible and Portable Structured Generation
vicentereig/dspy.rb
The Ruby framework for programming—rather than prompting—language models.
feenkcom/gt4llm
A GT package for working with LLMs
Evref-BL/Pharo-LLMAPI
Use LLM API from Pharo