ineelhere/shiny.ollama

Chat offline with open-source LLMs like deepseek-r1, nemotron, qwen, llama, and more, all through a simple R package powered by Shiny and Ollama. 🚀

Score: 22 / 100 (Experimental)

This R Shiny application allows you to chat privately with powerful AI models directly on your computer, without needing an internet connection. You provide your questions or prompts, and the AI responds using models like Deepseek or Llama that you've downloaded. This is ideal for researchers, data analysts, or anyone who wants to interact with large language models while keeping their conversations completely confidential and on their own machine.

No commits in the last 6 months.

Use this if you need to brainstorm, analyze text, or generate content using AI models offline and with absolute privacy, especially if you are comfortable with the R environment.

Not ideal if you prefer cloud-based AI services or do not want to manage local software installations like Ollama and R.

Tags: local-AI-chat, private-data-analysis, offline-language-modeling, R-programming, confidential-AI-interaction
Badges: Stale (6m), No Package, No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 21
Forks:
Language: R
License: Apache-2.0
Last pushed: Mar 12, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ineelhere/shiny.ollama"

Open to everyone: 100 requests/day with no API key. A free key raises the limit to 1,000 requests/day.
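For scripted access, the same endpoint can be queried from code. A minimal Python sketch using only the standard library; the URL is the one shown in the curl example above, but the response schema is not documented here, so the code assumes only that the endpoint returns JSON:

```python
import json
from urllib.request import urlopen
from urllib.parse import quote

# Base path taken from the curl example on this page.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the report URL for a GitHub owner/repo pair."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the report; assumes the endpoint returns JSON."""
    with urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

# Example (requires network):
#   report = fetch_quality("ineelhere", "shiny.ollama")
#   print(json.dumps(report, indent=2))
```

The same request works from R with `httr::GET()` or plain `curl`; the Python version is shown only because it needs no extra packages.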