tib0/local-llama

Local Llama (L³) is an Electron app that runs Llama 3 models locally.

Score: 29 / 100 (Experimental)

This desktop application allows you to chat with powerful AI language models directly on your computer, without needing an internet connection or external services. You input your questions or prompts, and the AI generates responses, which you can save for later. It's designed for anyone who wants to use advanced AI chatbots privately and offline.

No commits in the last 6 months.

Use this if you need to interact with large language models and prioritize data privacy, offline access, and full control over the AI's behavior and models.

Not ideal if you prefer using cloud-based AI services or do not have sufficient computer resources to run large AI models locally.

Tags: personal-productivity, private-data-analysis, offline-ai-chat, secure-language-processing
Status: Stale (6 months) · No Package · No Dependents
Maintenance: 2 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 5 / 25

Stars: 17
Forks: 1
Language: TypeScript
License: (not listed)
Last pushed: Jun 05, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/tib0/local-llama"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
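The same endpoint can be called from a script instead of curl. The sketch below builds the URL from an owner/repo pair and fetches the report with Python's standard library; the JSON schema of the response is not documented here, so the payload is returned as-is for the caller to inspect (the `quality_url` and `fetch_quality` helper names are illustrative, not part of the API).

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-report URL for a given GitHub owner/repo."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality report as parsed JSON.

    The response schema is not documented on this page, so no fields
    are assumed; callers should inspect the raw payload.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(quality_url("tib0", "local-llama"))
    # Uncomment to hit the live endpoint (counts against the 100/day limit):
    # print(json.dumps(fetch_quality("tib0", "local-llama"), indent=2))
```

Note that each call counts against the daily request quota, so cache results rather than polling.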