RightNow-AI/picolm

Run a 1-billion-parameter LLM on a $10 board with 256 MB of RAM

Quality score: 52 / 100 (Established)

PicoLM helps developers integrate powerful language AI directly into small, low-cost devices like a Raspberry Pi Zero, creating fully offline AI agents. It takes text input (like a question or command) and produces a text response or structured JSON, without needing internet access or expensive cloud services. This is ideal for developers building embedded AI solutions or privacy-focused applications.


Use this if you are a developer looking to add a local, privacy-preserving large language model to a project running on budget-friendly hardware like a Raspberry Pi, or to create an offline AI assistant.

Not ideal if you need to run large, state-of-the-art LLMs that require significant computational power, or if your application already relies on cloud-based AI services.

embedded-systems edge-computing offline-ai hardware-integration privacy-first-applications
No package published · No dependents
Maintenance 10 / 25
Adoption 10 / 25
Maturity 11 / 25
Community 21 / 25
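The overall score appears to be the simple sum of the four 25-point components; a quick sanity check (this is an assumption — the site's exact scoring formula is not documented here):

```python
# Score components as listed above, each out of 25.
components = {
    "Maintenance": 10,
    "Adoption": 10,
    "Maturity": 11,
    "Community": 21,
}

total = sum(components.values())
print(total)  # -> 52, matching the 52/100 overall score
```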


Stars: 1,364
Forks: 159
Language: C
License: MIT
Last pushed: Feb 22, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/RightNow-AI/picolm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
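The same endpoint can be called from any HTTP client. A minimal Python sketch using only the standard library (the JSON response schema is not documented here, so the result is treated as an opaque dict):

```python
import json
import urllib.request

# Base path of the quality API, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record; anonymous access is limited to 100 requests/day."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


print(quality_url("RightNow-AI", "picolm"))
# -> https://pt-edge.onrender.com/api/v1/quality/llm-tools/RightNow-AI/picolm
```

With an API key (1,000 requests/day), you would attach it to the request; how the key is passed (header vs. query parameter) is not specified on this page.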