sunshine0523/OllamaServer
No need for Termux: start the Ollama service on an Android device with one click.
This tool lets developers run Ollama, a runtime for large language models, directly on their Android devices. It simplifies setup with one-click deployment and built-in model management, which is useful for anyone who wants to test or use language models on mobile hardware.
312 stars. No commits in the last 6 months.
Use this if you are a developer who wants to run and manage Ollama language models directly on an Android device without a complex setup.
Not ideal if you are an end-user without programming knowledge, as it requires interacting with the Ollama service via an API.
Stars
312
Forks
37
Language
TypeScript
License
GPL-3.0
Category
Last pushed
May 06, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/sunshine0523/OllamaServer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
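The curl command above can also be called programmatically. A minimal Python sketch, assuming only that the endpoint returns JSON (the response schema is not documented here, and the `fetch_quality` helper name is ours):

```python
# Hedged sketch: query the pt-edge quality endpoint for a GitHub repo.
# We assume only that the endpoint returns a JSON body; its schema is
# not documented in this listing.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the endpoint URL for a given GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and parse the JSON body (schema assumed)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Pretty-print whatever the API returns for this repo.
    print(json.dumps(fetch_quality("sunshine0523", "OllamaServer"), indent=2))
```

At the free tier (100 requests/day) no key is needed; with a key, the documentation above states the limit rises to 1,000/day.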
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.