nobodywho-ooo/nobodywho

NobodyWho is an inference engine that lets you run LLMs locally and efficiently on any device.

Quality score: 69 / 100 (Established)

This library helps developers integrate large language models (LLMs) directly into their applications, allowing these AI models to run offline and on the user's device. Developers provide an LLM in GGUF format and receive text responses, enabling features like chatbots or AI assistants without cloud dependency. It's ideal for application developers building desktop, mobile, or game experiences who want to embed AI capabilities directly.

744 stars. Actively maintained with 42 commits in the last 30 days. Available on PyPI.

Use this if you are an application developer looking to embed offline, efficient AI capabilities directly into your Python, Flutter, or Godot applications on Windows, Linux, macOS, or Android.

Not ideal if you need to deploy AI models on iOS or web platforms, or if you are not a developer and are looking for a ready-to-use AI application.

Tags: application-development, game-development, mobile-app-development, offline-ai, AI-integration
Dependents: none
Score breakdown: Maintenance 20/25 · Adoption 10/25 · Maturity 25/25 · Community 14/25


Stars: 744 · Forks: 43 · Language: Rust · License: EUPL-1.2 · Last pushed: Mar 13, 2026 · Commits (30d): 42

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/nobodywho-ooo/nobodywho"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
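The curl call above can also be scripted. A minimal Python sketch, assuming only the endpoint path shown in the example; the JSON field names in the response are not documented here, so inspect the returned dict before relying on specific keys:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for an owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record and parse it as JSON.
    Response schema is an assumption; inspect the dict's keys first."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Unauthenticated requests are limited to 100/day per the note above.
    record = fetch_quality("nobodywho-ooo", "nobodywho")
    print(json.dumps(record, indent=2))
```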