PhilYeh1212/Local-AI-Knowledge-Base-Docker-Llama3

A production-ready, 100% offline RAG Knowledge Base using Docker, Llama 3, and Ollama. Chat with your documents privately. Enterprise architecture showcase.

Score: 27 / 100 (Experimental)

This tool lets engineers and enterprises chat with private PDF documents, such as technical manuals or datasheets, without sending any data to external cloud services. You load your PDFs and ask questions to get precise answers from their content, all processed offline on your own machine. It is designed for technical professionals who need to query confidential documentation with full privacy.
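The repository's exact implementation is not shown on this page, but a local RAG pipeline of this kind typically chunks the extracted PDF text, embeds each chunk (e.g. via Ollama's embeddings endpoint), and ranks chunks by cosine similarity against the query before handing the top matches to Llama 3. A minimal sketch of the retrieval side, with chunking and similarity shown as pure functions (function names and parameters here are illustrative, not the repo's API):

```python
import math

def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split extracted PDF text into overlapping chunks for embedding."""
    chunks = []
    step = size - overlap
    for start in range(0, max(len(text) - overlap, 1), step):
        chunks.append(text[start:start + size])
    return chunks

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity, used to rank chunk embeddings against the query."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec: list[float], chunk_vecs: list[list[float]], k: int = 3) -> list[int]:
    """Return the indices of the k chunks most similar to the query."""
    ranked = sorted(range(len(chunk_vecs)),
                    key=lambda i: cosine(query_vec, chunk_vecs[i]),
                    reverse=True)
    return ranked[:k]
```

In a full pipeline, the selected chunks would be pasted into a prompt and sent to the local Llama 3 model, so no document text ever leaves the machine.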

Use this if you need to query sensitive or proprietary technical documents and absolutely cannot allow your data to leave your local environment.

Not ideal if you primarily work with non-technical, general-knowledge documents or are comfortable using cloud-based AI services.

technical-documentation enterprise-IT data-security information-retrieval offline-AI
No license · No package · No dependents
Maintenance 6 / 25
Adoption 5 / 25
Maturity 5 / 25
Community 11 / 25


Stars: 12
Forks: 2
Language: not listed
License: none
Last pushed: Dec 18, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/PhilYeh1212/Local-AI-Knowledge-Base-Docker-Llama3"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
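The same endpoint can be called from Python with only the standard library. This is a minimal sketch following the curl example above; the response schema is not documented on this page, so the JSON is returned as-is rather than assuming any field names:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL ('rag' is the category in the example)."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the quality record for a repository."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("rag", "PhilYeh1212", "Local-AI-Knowledge-Base-Docker-Llama3")` requests the same URL as the curl command above.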