vndee/local-talking-llm
A talking LLM that runs on your own computer without needing the internet.
This project helps you build a personalized voice assistant that runs entirely on your computer, with no internet connection required. You speak into your microphone, the assistant processes your request with a local language model, and it speaks the response back to you. It is ideal for anyone who wants a private, customizable AI assistant in the style of Jarvis or Friday, one that can even mimic a specific voice.
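The loop described above (listen, transcribe, generate, speak) can be sketched in a few lines. This is a minimal illustration of the pattern, not the project's actual API: `record_audio`, `transcribe`, `generate`, and `speak` are hypothetical stand-ins for whatever local speech-to-text, LLM, and text-to-speech backends you wire in.

```python
def run_turn(record_audio, transcribe, generate, speak):
    """One turn of a local voice-assistant loop.

    All four callables are injected, so any offline backend
    (e.g. a local Whisper model, an Ollama model, a local TTS
    engine) can be plugged in without network access.
    """
    audio = record_audio()       # capture microphone input
    prompt = transcribe(audio)   # local speech-to-text
    reply = generate(prompt)     # local language model
    speak(reply)                 # local text-to-speech
    return reply


def assistant_loop(record_audio, transcribe, generate, speak, turns=None):
    """Run the turn loop; `turns=None` means run until interrupted."""
    count = 0
    while turns is None or count < turns:
        run_turn(record_audio, transcribe, generate, speak)
        count += 1
```

Because everything is dependency-injected, each stage can be swapped or tested in isolation, which mirrors how a fully offline pipeline stays modular.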
Use this if you want a private, offline voice assistant capable of natural conversation and voice cloning, running directly from your personal computer.
Not ideal if you're looking for an out-of-the-box, plug-and-play smart speaker solution without any technical setup.
Stars: 795
Forks: 167
Language: Python
License: MIT
Category:
Last pushed: Oct 20, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/vndee/local-talking-llm"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.