shono-io/nats-ai
A NATS microservice interacting with Ollama
This tool helps developers integrate large language models (LLMs) into their applications via a NATS microservice. You provide a text prompt to the service, optionally specifying an LLM and a conversation thread, and it returns a streamed text response from the LLM. It's designed for developers building real-time, event-driven applications that need to leverage local LLMs.
No commits in the last 6 months.
Use this if you are a developer building a distributed application and need to easily expose a local large language model as a service that can be called by other components.
Not ideal if you are an end-user looking for a direct chat interface with an LLM or if you don't have existing NATS infrastructure.
Stars: 18
Forks: 4
Language: Go
License: MIT
Category: llm-tools
Last pushed: Jun 30, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/shono-io/nats-ai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
sammcj/gollama
Go manage your Ollama models
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
ollama4j/ollama4j-web-ui
Web UI for Ollama built in Java with Vaadin, Spring Boot and Ollama4j