web-llm and llm-x
These projects are complementary: web-llm provides the core in-browser inference engine, and llm-x can use it as an underlying LLM runtime to deliver a user-friendly web interface for local model execution.
About web-llm
mlc-ai/web-llm
High-performance In-browser LLM Inference Engine
WebLLM lets web developers integrate powerful AI language models directly into their web applications, running entirely within the user's browser. Given model weights and a prompt, it produces AI-generated text, chat responses, or structured JSON output. It is aimed at web developers building interactive, privacy-focused AI experiences for their users.
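WebLLM exposes an OpenAI-compatible chat-completions interface. The sketch below shows that API shape; since real inference requires a browser with WebGPU (via `CreateMLCEngine` from `@mlc-ai/web-llm`), a mock engine and the model id shown are stand-ins so the shape is runnable anywhere:

```typescript
// Shape of WebLLM's OpenAI-compatible chat API (a sketch, not the library itself).
// Real usage would be roughly:
//   import { CreateMLCEngine } from "@mlc-ai/web-llm";
//   const engine = await CreateMLCEngine("<model-id>"); // needs WebGPU in a browser
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}
interface ChatCompletion {
  choices: { message: ChatMessage }[];
}
interface Engine {
  chat: {
    completions: {
      create(req: { messages: ChatMessage[] }): Promise<ChatCompletion>;
    };
  };
}

// Mock engine standing in for the real WebGPU-backed one.
const engine: Engine = {
  chat: {
    completions: {
      create: async ({ messages }) => ({
        choices: [
          {
            message: {
              role: "assistant",
              content: `echo: ${messages[messages.length - 1].content}`,
            },
          },
        ],
      }),
    },
  },
};

async function main() {
  // Same call shape as OpenAI's chat completions API.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello from the browser" }],
  });
  console.log(reply.choices[0].message.content);
}
main();
```

Because the request and response shapes mirror OpenAI's API, code written against a hosted endpoint can often be pointed at the in-browser engine with minimal changes.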
About llm-x
mrdjohnson/llm-x
LLMX; Easiest 3rd party Local LLM UI for the web!
This tool lets you interact with various AI models on your own computer, keeping your conversations and generated content private. You can input text or images and receive text responses or newly generated images, all without sending your data to external servers. It suits anyone who wants to use AI for tasks like drafting text, getting explanations, or creating images while keeping their data local.