cocktailpeanut/dalai

The simplest way to run LLaMA on your local machine

Score: 38 / 100 (Emerging)

This project helps you run large language models, specifically LLaMA and Alpaca, directly on your personal computer. It takes the model files as input and exposes an accessible web interface and APIs for interacting with the models. Anyone who wants to experiment with local AI chat models without relying on cloud services will find it useful.

12,980 stars. No commits in the last 6 months.

Use this if you want to run LLaMA or Alpaca language models locally on your Mac, Windows, or Linux machine for personal use or development.

Not ideal if you need to run highly specialized or custom large language models that are not LLaMA or Alpaca, or if your computer has very limited memory and disk space.

Tags: AI experimentation, local LLM, natural language processing, personal AI, machine learning demo
Badges: No License · Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 8 / 25
Community: 20 / 25


Stars: 12,980
Forks: 1,352
Language: CSS
License: none
Last pushed: Jun 18, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/cocktailpeanut/dalai"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
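For scripted access, the same endpoint can be called from Python. This is a minimal sketch, assuming the endpoint returns JSON and that URLs follow the pattern shown in the curl command above; the field names of the response are not documented here, so the fetch helper simply returns the parsed payload.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(registry: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repo,
    mirroring the curl example (registry, then owner/name)."""
    return f"{API_BASE}/{registry}/{repo}"

def fetch_quality(registry: str, repo: str) -> dict:
    """Fetch the quality report as a dict. Assumes a JSON response;
    no API key is needed within the 100 requests/day limit."""
    with urllib.request.urlopen(quality_url(registry, repo)) as resp:
        return json.load(resp)

# URL for this project, matching the curl example above
print(quality_url("transformers", "cocktailpeanut/dalai"))
```

With a free key (1,000 requests/day), you would presumably attach it as a header or query parameter; the exact mechanism isn't shown on this page, so it is left out of the sketch.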