boneylizard/Eloquent
The most feature-complete local AI workstation. Multi-GPU inference, integrated Stable Diffusion + ADetailer, voice cloning, research-grade ELO testing, and tool-calling code editor. 100% local. Zero subscriptions. Your GPUs deserve better.
Eloquent is a comprehensive desktop application for running AI tasks locally on a Windows PC with an NVIDIA GPU. From text prompts, audio samples, or image inputs it generates AI chat responses, images, and cloned voices, and it can also help evaluate different AI models. The tool is aimed at writers, roleplayers, AI researchers, and anyone who wants advanced AI capabilities without relying on cloud services or subscriptions.
Use this if you are a power user with a Windows PC and NVIDIA GPU who wants a single application for local AI chat, image generation, voice cloning, and AI model evaluation, prioritizing privacy and avoiding cloud subscriptions.
Not ideal if you lack an NVIDIA GPU or run macOS or Linux, as the application is Windows-only and depends on NVIDIA hardware.
Stars: 56
Forks: 6
Language: Python
License: AGPL-3.0
Category:
Last pushed: Feb 24, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/boneylizard/Eloquent"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
SillyTavern/SillyTavern
LLM Frontend for Power Users.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.