Unmortan-Ellary/Vascura-FRONT
Bloat-free, portable, and lightweight LLM frontend (single HTML file), with Lorebook, web search, macro engine, etc.
Vascura FRONT is a lightweight, portable web interface that helps you interact with large language models. You input your prompts and get back AI-generated text, with advanced features to manage context, integrate web search results, and use dynamic content. It's designed for anyone who regularly uses AI chatbots for creative writing, research, or complex conversational tasks.
Use this if you need a flexible, standalone AI chat interface that runs directly in your browser without any installations, offering powerful tools to control AI responses and integrate external knowledge.
Not ideal if you're looking for a simple, no-frills chat experience without needing advanced features like Lorebooks, custom macros, or deep control over AI context.
Stars: 19
Forks: 1
Language: HTML
License: Apache-2.0
Last pushed: Mar 12, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Unmortan-Ellary/Vascura-FRONT"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
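A minimal sketch of consuming this endpoint from Python. The response schema is not documented on this page, so the field names below are assumptions inferred from the metadata shown above, not a confirmed API contract; a sample payload stands in for a live request.

```python
import json

# Hypothetical response shape -- the field names are assumptions based on
# the metadata displayed on this page, not a documented schema. A real
# client would fetch this JSON from the /api/v1/quality/... endpoint.
sample = json.loads("""
{
  "repo": "Unmortan-Ellary/Vascura-FRONT",
  "stars": 19,
  "forks": 1,
  "language": "HTML",
  "license": "Apache-2.0",
  "commits_30d": 0
}
""")

def summarize(data: dict) -> str:
    """Render a one-line summary from the (assumed) quality payload."""
    return f"{data['repo']}: {data['stars']} stars, {data['language']}, {data['license']}"

print(summarize(sample))
```

To run it against the live endpoint, replace `sample` with the parsed body of the `curl` request shown above (e.g. via `urllib.request.urlopen`), keeping in mind the 100 requests/day limit.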
Higher-rated alternatives
ParisNeo/lollms-webui
Lord of Large Language and Multi modal Systems Web User Interface
ggozad/oterm
the terminal client for Ollama
owndev/Open-WebUI-Functions
Open-WebUI-Functions is a collection of custom pipelines, filters, and integrations designed to...
hand-e-fr/OpenHosta
A lightweight library integrating LLM natively into Python
lmg-anon/mikupad
LLM Frontend in a single html file