romilandc/streamlit-ollama-llm

A Streamlit user interface for a local LLM running on Ollama. With just three Python apps you can have a local LLM to chat with. I'm running Ollama on Windows (recently updated) alongside the DuckDuckGo browser, and it works great as a coding assistant.

Score: 40 / 100 (Emerging)

This project provides a simple chat interface for running large language models (LLMs) directly on your own computer, ensuring your conversations remain private. You input text prompts or questions, and the system generates responses or code suggestions using models like Mistral or Code Llama. This is ideal for developers, data scientists, or anyone who wants to use AI assistants without sending data to external cloud services.
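As a rough illustration of the pattern such an app follows, here is a minimal Streamlit chat loop against a local Ollama server. This is a sketch, not the repo's actual code; it assumes the ollama Python package is installed, the Ollama server is running, and a model such as mistral has been pulled.

    # sketch.py -- minimal Streamlit + Ollama chat loop (illustrative only)
    # Run with: streamlit run sketch.py
    import streamlit as st
    import ollama

    st.title("Local LLM Chat")

    # Keep the running conversation in session state so it survives reruns.
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Replay the conversation so far.
    for msg in st.session_state.messages:
        with st.chat_message(msg["role"]):
            st.markdown(msg["content"])

    # Read a new prompt, send the full history to the local Ollama server,
    # and stream the reply back into the chat window.
    if prompt := st.chat_input("Ask something"):
        st.session_state.messages.append({"role": "user", "content": prompt})
        with st.chat_message("user"):
            st.markdown(prompt)
        with st.chat_message("assistant"):
            stream = ollama.chat(
                model="mistral",  # assumed model name; any pulled model works
                messages=st.session_state.messages,
                stream=True,
            )
            reply = st.write_stream(
                chunk["message"]["content"] for chunk in stream
            )
        st.session_state.messages.append(
            {"role": "assistant", "content": reply}
        )

Because everything above talks to a server on localhost, no prompt or response ever leaves the machine, which is the privacy property the project is built around.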

No commits in the last 6 months.

Use this if you need a private, locally hosted AI assistant for coding help, general questions, or content generation and want full control over your data.

Not ideal if you prefer cloud-based AI services, need advanced LLM features not available in local models, or are not comfortable with command-line setup.

Tags: coding-assistant, private-AI, local-LLM, developer-tools, data-science-workflow
Status: Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 17 / 25

The four subscores (each out of 25) sum to the overall score: 0 + 7 + 16 + 17 = 40.


Stars: 28
Forks: 10
Language: Python
License: Apache-2.0
Last pushed: Apr 29, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/romilandc/streamlit-ollama-llm"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
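For scripted access, the same endpoint can be fetched from Python. A minimal sketch, assuming the endpoint returns JSON; the exact response fields are not documented here, so the example simply prints the payload.

    # Fetch the quality report for this repo; illustrative only.
    import requests

    url = ("https://pt-edge.onrender.com/api/v1/quality/"
           "llm-tools/romilandc/streamlit-ollama-llm")
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()  # fail loudly on rate limits or server errors
    print(resp.json())       # assumed JSON body; fields not documented here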