ashtewari/bookshelf
LLM RAG enabled bookshelf
This tool helps you quickly find information within your collection of books and documents. You load your personal library of PDFs, then ask questions in plain language to get precise answers extracted directly from your texts. Anyone who frequently needs to recall specific details from many books or extensive documentation, such as researchers, students, or knowledge workers, would find this useful.
No commits in the last 6 months.
Use this if you have a large personal library of digital books and documents and need a fast way to get answers to your questions directly from your collection.
Not ideal if you primarily work with web articles, external databases, or need to generate creative content rather than extract information from existing sources.
Stars
10
Forks
2
Language
Python
License
MIT
Category
Last pushed
Mar 11, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/ashtewari/bookshelf"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
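The curl command above can also be wrapped in a few lines of Python. This is a minimal sketch: the endpoint URL comes from the page, but the shape of the JSON response is an assumption, so inspect what actually comes back before relying on specific fields.

```python
# Minimal sketch of calling the quality API from Python (stdlib only).
# The endpoint is taken from the page; the JSON structure of the
# response is NOT documented here, so treat the returned dict as opaque
# until you have inspected it.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_repo_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record for a repo; returns the parsed JSON."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Anonymous access is rate-limited to 100 requests/day per the page.
    print(quality_url("ashtewari", "bookshelf"))
```

With a free API key (1,000 requests/day), you would presumably attach it as a header or query parameter; the page does not say which, so check the API's own documentation.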
Higher-rated alternatives
run-llama/llama_index
LlamaIndex is the leading document agent and OCR platform
emarco177/documentation-helper
Reference implementation of a RAG-based documentation helper using LangChain, Pinecone, and Tavily.
janus-llm/janus-llm
Leveraging LLMs for modernization through intelligent chunking, iterative prompting and...
JetXu-LLM/llama-github
Llama-github is an open-source Python library that empowers LLM Chatbots, AI Agents, and...
Vasallo94/ObsidianRAG
RAG system to query your Obsidian notes using LangGraph and local LLMs (Ollama)