josStorer/llama.cpp-unicode-windows
llama.cpp with Unicode (Windows) support
This project helps Windows users run large language models (LLMs) locally, with support for non-English text such as Chinese. You provide a language model file, and it lets you chat with or query the model directly on your desktop. It is aimed at individuals or small teams who want to use LLMs offline, especially for tasks involving East Asian languages.
No commits in the last 6 months.
Use this if you are a Windows user needing to run an LLM on your local machine, particularly for interacting with content in languages like Chinese, without needing an internet connection.
Not ideal if you need a web-based interface, advanced integration with other applications, or are working primarily with English-only models where standard llama.cpp might suffice.
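As a rough sketch of local use: since the repo is a fork of early-2023 llama.cpp, invocation likely follows that era's command-line interface. The binary name, model path, and model filename below are assumptions for illustration, not taken from this repo's documentation.

```shell
# Hypothetical invocation sketch, assuming the standard llama.cpp "main"
# binary and a locally converted/quantized model file (paths are examples).
# -m: model file, -p: prompt (here in Chinese), -n: number of tokens to generate
main.exe -m models\7B\ggml-model-q4_0.bin -p "你好，请用中文介绍一下你自己。" -n 128
```

The point of this fork is that the Chinese prompt and the model's Chinese output display correctly in the Windows console, which stock builds of that era often garbled.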
Stars: 54
Forks: 4
Language: C
License: MIT
Category:
Last pushed: Apr 04, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/josStorer/llama.cpp-unicode-windows"
Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
Data framework for your LLM applications. Focus on server side solution
JHubi1/ollama-app
A modern and easy-to-use client for Ollama
serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.