gbaptista/ollama-ai
A Ruby gem for interacting with Ollama's API, letting you run open-source large language models (LLMs) locally.
It is aimed at developers who want to integrate open-source LLMs into their applications: it takes natural-language prompts and model specifications as input and returns text completions, chat responses, or numerical embeddings. Ruby developers can use it to build custom AI features directly into their Ruby-based software.
255 stars. No commits in the last 6 months.
Use this if you are a Ruby developer looking to programmatically control and integrate locally-run Ollama LLMs into your custom applications, requiring low-level API access.
Not ideal if you are an end-user seeking a ready-to-use, high-level AI application or a non-developer looking for a user-friendly interface.
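To make the "low-level API access" concrete, here is a minimal sketch of the kind of call the gem wraps: Ollama exposes an HTTP API (by default on localhost:11434), and a generate request is a small JSON payload. This uses only Ruby's standard library, not the gem's own interface; the model name, prompt, and endpoint details here are illustrative assumptions based on Ollama's documented REST API.

```ruby
require 'json'
require 'net/http'

# Ollama's default local endpoint for one-shot text generation
# (assumption: a local Ollama server running on the default port).
OLLAMA_URL = URI('http://localhost:11434/api/generate')

# Build the JSON body for a generate request. `stream: false` asks
# for a single JSON response instead of a stream of chunks.
def build_generate_payload(model, prompt)
  { model: model, prompt: prompt, stream: false }.to_json
end

payload = build_generate_payload('llama3', 'Why is the sky blue?')
puts payload

# To actually send it (requires a running Ollama server):
# res = Net::HTTP.post(OLLAMA_URL, payload, 'Content-Type' => 'application/json')
# puts JSON.parse(res.body)['response']
```

The ollama-ai gem layers a Ruby client over requests like this one, so you work with method calls and hashes instead of raw HTTP.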
Stars: 255
Forks: 12
Language: Ruby
License: MIT
Category:
Last pushed: Jul 21, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/gbaptista/ollama-ai"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
Data framework for your LLM applications. Focus on server side solution
JHubi1/ollama-app
A modern and easy-to-use client for Ollama
serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.