romilandc/streamlit-ollama-llm
A Streamlit user interface for a local LLM running on Ollama. With just three Python scripts you can have a local LLM to chat with. I'm running Ollama on Windows (recently updated) alongside the DuckDuckGo browser, and it works well as a coding assistant.
This project provides a simple chat interface for running large language models (LLMs) directly on your own computer, ensuring your conversations remain private. You input text prompts or questions, and the system generates responses or code suggestions using models like Mistral or Code Llama. This is ideal for developers, data scientists, or anyone who wants to use AI assistants without sending data to external cloud services.
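The repository's own scripts aren't reproduced here, but the pattern they describe, a Streamlit chat front end talking to a local Ollama server, can be sketched as follows. This is a minimal illustration, not the project's actual code: it assumes Ollama is serving its default REST API at `localhost:11434` and that a model such as `mistral` has already been pulled.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint
MODEL = "mistral"  # assumption: pulled beforehand with `ollama pull mistral`


def build_chat_payload(model: str, messages: list) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint (non-streaming)."""
    return {"model": model, "messages": messages, "stream": False}


def ask_ollama(messages: list) -> str:
    """Send the conversation to the local Ollama server and return the reply text."""
    body = json.dumps(build_chat_payload(MODEL, messages)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]


def run_ui() -> None:
    """Minimal Streamlit chat loop; launch with `streamlit run app.py`."""
    import streamlit as st  # imported lazily so the helpers above work without Streamlit

    st.title("Local LLM chat")
    if "history" not in st.session_state:
        st.session_state.history = []

    # Replay the conversation so far, then append each new exchange.
    for msg in st.session_state.history:
        st.chat_message(msg["role"]).write(msg["content"])

    if prompt := st.chat_input("Ask something"):
        st.session_state.history.append({"role": "user", "content": prompt})
        st.chat_message("user").write(prompt)
        reply = ask_ollama(st.session_state.history)
        st.session_state.history.append({"role": "assistant", "content": reply})
        st.chat_message("assistant").write(reply)


if __name__ == "__main__":
    run_ui()
```

Because everything stays on `localhost`, no prompt or response ever leaves the machine, which is the privacy property the project advertises.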
No commits in the last 6 months.
Use this if you need a private, locally hosted AI assistant for coding help, general questions, or content generation and want full control over your data.
Not ideal if you prefer cloud-based AI services, need advanced LLM features unavailable in local models, or aren't comfortable with command-line setup.
Stars: 28
Forks: 10
Language: Python
License: Apache-2.0
Category:
Last pushed: Apr 29, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/romilandc/streamlit-ollama-llm"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
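The same endpoint can be called from Python instead of curl. A small sketch using only the standard library; the shape of the returned JSON is not documented here, so the code just parses and returns whatever the API sends back.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL from the owner and repo names."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as a dict (response schema not documented here)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    data = fetch_quality("romilandc", "streamlit-ollama-llm")
    print(json.dumps(data, indent=2))
```

Within the 100 requests/day anonymous limit this needs no authentication header; a free key for the higher tier would presumably be passed per the API's own docs.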
Higher-rated alternatives
posit-dev/chatlas
Your friendly guide to building LLM chat apps in Python with less effort and more clarity.
xming521/WeClone
🚀 One-stop solution for creating your AI twin from chat history 💡 Fine-tune LLMs with your chat...
ooyinet/WeClone
🚀 A one-stop solution for creating a digital twin from your chat history 💡 Fine-tune an LLM on your chat records so it captures your personal style, then bind it to a chatbot to create your own digital avatar. Digital clone / digital twin / digital immortality / LLM / chatbot / LoRA
vemonet/libre-chat
🦙 Free and Open Source Large Language Model (LLM) chatbot web UI and API. Self-hosted, offline...
qqqqqf-q/MirrorFlow
From dialogue data to a closed training loop: digital twin + model distillation