alexrozanski/LlamaChat
Chat with your favourite LLaMA models in a native macOS app
This app lets you chat with LLaMA-family language models, such as Alpaca and GPT4All, directly on your Mac with no internet connection required. You supply the model files, and the app runs inference on your text input locally to generate responses. It suits anyone who wants to experiment with, or regularly use, large language models privately on their own machine.
1,514 stars. No commits in the last 6 months.
Use this if you want to run LLaMA, Alpaca, or GPT4All language models directly on your Mac for local, private conversations or text generation.
Not ideal if you're looking for a cloud-based AI assistant or don't want to manage model file downloads and local setup yourself.
Stars: 1,514
Forks: 61
Language: Swift
License: MIT
Category:
Last pushed: Jun 09, 2023
Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/alexrozanski/LlamaChat"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
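The same endpoint can be called from a script. A minimal sketch in Python, assuming only the URL pattern shown in the curl command above (the response schema is not documented here, so the code just decodes the raw JSON body without relying on specific fields):

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Base URL taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for an owner/repo pair."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON response (field names are not
    documented on this page, so the dict is returned as-is)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("alexrozanski", "LlamaChat"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/alexrozanski/LlamaChat
```

Unauthenticated calls count against the 100 requests/day limit; with an API key you would typically pass it in a request header, though the header name is not specified on this page.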
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
Data framework for your LLM applications, focused on server-side use.
JHubi1/ollama-app
A modern and easy-to-use client for Ollama
serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.