skye-harris/hass_local_openai_llm

Home Assistant LLM integration for local OpenAI-compatible services (llamacpp, vllm, etc)

Quality score: 39 / 100 (Emerging)

This integration lets you use powerful local large language models (LLMs), served through OpenAI-compatible inference servers such as llama.cpp or vLLM, directly within your Home Assistant smart home setup. It takes your voice commands or text input, uses the local LLM to understand and respond, and so enables more intelligent home automation and interactions. It is aimed at Home Assistant users who want advanced, private, and customizable AI for their smart home assistant.
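Under the hood, servers like llama.cpp's built-in HTTP server and vLLM expose an OpenAI-compatible `/v1/chat/completions` endpoint, which is what an integration like this talks to. A minimal sketch of such a request (the port 8080 is llama.cpp's common default, vLLM typically uses 8000, and the placeholder model name is an assumption; local servers often ignore it):

```python
import json
import urllib.request

def build_chat_request(prompt, model="local-model"):
    # Minimal OpenAI-style chat completion payload; "local-model" is a
    # placeholder -- many local servers accept any model name.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask_local_llm(prompt, base_url="http://localhost:8080"):
    # POST to the OpenAI-compatible endpoint exposed by llama.cpp or vLLM.
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

This is a sketch of the protocol, not the integration's actual code; the integration's own configuration determines the real endpoint and parameters.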


Use this if you want to integrate a private, self-hosted, and customizable large language model directly into your Home Assistant to power conversational AI and automation.

Not ideal if you prefer cloud-based AI services or do not have the technical expertise to set up and manage local LLM inference servers.

Tags: home-automation, smart-home-AI, private-LLM, conversational-AI, local-inference
No license · No package · No dependents
Maintenance 10 / 25
Adoption 9 / 25
Maturity 5 / 25
Community 15 / 25


Stars: 100
Forks: 14
Language: Python
License: none
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/skye-harris/hass_local_openai_llm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
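The same endpoint can be called programmatically. A minimal Python sketch (the Bearer-token header for the keyed tier is an assumption, and the response schema is whatever JSON the API actually returns):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner, repo):
    # Build the endpoint URL shown in the curl example above.
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner, repo, api_key=None):
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        # Assumption: the free key is sent as a Bearer token; check the
        # API's documentation for the actual header it expects.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```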