ineelhere/shiny.ollama
Chat offline with open-source LLMs like deepseek-r1, nemotron, qwen, llama, and more, all through a simple R package powered by Shiny and Ollama. 🚀
This R Shiny application allows you to chat privately with powerful AI models directly on your computer, without needing an internet connection. You provide your questions or prompts, and the AI responds using models like Deepseek or Llama that you've downloaded. This is ideal for researchers, data analysts, or anyone who wants to interact with large language models while keeping their conversations completely confidential and on their own machine.
No commits in the last 6 months.
Use this if you need to brainstorm, analyze text, or generate content using AI models offline and with absolute privacy, especially if you are comfortable with the R environment.
Not ideal if you prefer cloud-based AI services or do not want to manage local software installations like Ollama and R.
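Under the hood, an app like this talks to a locally running Ollama server over its REST API. The following is a minimal sketch of that interaction, not code from this package: it assumes Ollama is serving at its default address (localhost:11434), that a model such as llama3 has already been pulled with `ollama pull`, and that the httr2 package is installed; the `ask_ollama` helper name is illustrative.

```r
# Minimal sketch: query a local Ollama server from R (assumed setup, not
# shiny.ollama's own implementation).
library(httr2)

ask_ollama <- function(prompt, model = "llama3",
                       host = "http://localhost:11434") {
  # Ollama's /api/generate endpoint accepts a JSON body with the model
  # name and prompt; stream = FALSE returns one complete JSON response.
  resp <- request(paste0(host, "/api/generate")) |>
    req_body_json(list(model = model, prompt = prompt, stream = FALSE)) |>
    req_perform()
  resp_body_json(resp)$response
}

# Example (requires a running Ollama server):
# ask_ollama("Summarise the benefits of offline LLMs in one sentence.")
```

Because everything goes to localhost, no prompt or response ever leaves the machine, which is the privacy property the description above emphasizes.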
Stars: 21
Forks: —
Language: R
License: Apache-2.0
Category:
Last pushed: Mar 12, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ineelhere/shiny.ollama"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
SillyTavern/SillyTavern: LLM Frontend for Power Users.
libre-webui/libre-webui: Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui: A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab: Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j: A simple Java library for interacting with Ollama server.