mshojaei77/ollama_rag
Fully local RAG system using Ollama and FAISS
This application answers questions about your PDF documents. You supply PDF files, ask a question in plain language, and it returns a relevant answer drawn directly from the documents' content. It suits anyone (researchers, students, business analysts) who needs to extract specific information or summarize parts of text-based PDFs without manually sifting through pages.
No commits in the last 6 months.
Use this if you need to quickly ask questions and get answers from a collection of your own PDF documents on your local computer, ensuring your data never leaves your machine.
Not ideal if your PDFs are scanned images without selectable text, or if you need to analyze extremely large volumes of documents with enterprise-level tools.
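At its core, a system like this extracts text from PDFs, chunks it, embeds the chunks into a FAISS index, retrieves the chunks most similar to the question, and feeds them to an Ollama-hosted LLM as context. A minimal sketch of the retrieve-then-answer step, with toy bag-of-words vectors standing in for Ollama embeddings and a plain dot product standing in for FAISS (all names here are illustrative, not this repo's actual API):

```python
import numpy as np

# Toy corpus standing in for chunks extracted from a PDF.
chunks = [
    "FAISS builds an index over dense vectors for fast similarity search.",
    "Ollama runs large language models locally on your own machine.",
    "PDF text is split into overlapping chunks before embedding.",
]

def tokenize(text: str) -> list[str]:
    """Lowercase and strip basic punctuation."""
    return text.lower().translate(str.maketrans("", "", ".,?!")).split()

def embed(text: str, vocab: dict) -> np.ndarray:
    """Toy bag-of-words embedding; a real system would call an embedding model."""
    vec = np.zeros(len(vocab))
    for word in tokenize(text):
        if word in vocab:
            vec[vocab[word]] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Build a shared vocabulary and "index" the chunks (FAISS stand-in).
vocab = {w: i for i, w in enumerate(sorted({w for c in chunks for w in tokenize(c)}))}
index = np.stack([embed(c, vocab) for c in chunks])

def retrieve(question: str, k: int = 1) -> list[str]:
    """Return the k chunks most similar to the question (cosine via dot product)."""
    scores = index @ embed(question, vocab)
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

top = retrieve("how does faiss similarity search work?")
# The retrieved chunks would then be placed into the LLM prompt as context.
```

In the real repository, the embedding and generation calls go to a local Ollama server and the dot-product search is handled by a FAISS index, so no data leaves the machine.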
Stars: 43
Forks: 4
Language: Python
License: —
Category:
Last pushed: Mar 22, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/mshojaei77/ollama_rag"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
run-llama/llama_index
LlamaIndex is the leading document agent and OCR platform
emarco177/documentation-helper
Reference implementation of a RAG-based documentation helper using LangChain, Pinecone, and Tavily.
janus-llm/janus-llm
Leveraging LLMs for modernization through intelligent chunking, iterative prompting and...
JetXu-LLM/llama-github
Llama-github is an open-source Python library that empowers LLM Chatbots, AI Agents, and...
Vasallo94/ObsidianRAG
RAG system to query your Obsidian notes using LangGraph and local LLMs (Ollama)