ObsidianRAG and Local-RAG-with-Ollama

These tools are competitors, as both aim to provide a local RAG system, with the former specifically targeting Obsidian notes and the latter offering a more general framework.

|                | ObsidianRAG       | Local-RAG-with-Ollama                          |
|----------------|-------------------|------------------------------------------------|
| Overall score  | 51 (Established)  | 40 (Emerging)                                  |
| Maintenance    | 10/25             | 2/25                                           |
| Adoption       | 7/25              | 9/25                                           |
| Maturity       | 25/25             | 7/25                                           |
| Community      | 9/25              | 22/25                                          |
| Stars          | 29                | 76                                             |
| Forks          | 3                 | 48                                             |
| Downloads      |                   |                                                |
| Commits (30d)  | 0                 | 0                                              |
| Language       | Python            | Python                                         |
| License        | MIT               | None                                           |
| Risk flags     | None              | No License, Stale 6m, No Package, No Dependents |

About ObsidianRAG

Vasallo94/ObsidianRAG

RAG system to query your Obsidian notes using LangGraph and local LLMs (Ollama)

This tool allows Obsidian users to ask questions in plain language and get intelligent answers directly from their personal notes. You provide your existing Obsidian vault, and the system uses local AI to retrieve and synthesize information, giving you a clear answer along with links to the original notes as sources. This is perfect for knowledge workers, researchers, or anyone who maintains a large personal knowledge base in Obsidian.

personal-knowledge-management note-taking information-retrieval research-assistance digital-librarian
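The retrieval step described above can be sketched in plain Python. This is a simplified stand-in, not the project's actual code: ObsidianRAG uses LangGraph with local embeddings via Ollama, whereas this sketch ranks notes by simple term overlap. The vault-loading helper and the `retrieve` function are illustrative names, not the tool's API.

```python
from pathlib import Path

def load_vault(vault_dir):
    """Read every Markdown note in an Obsidian vault into {name: text}.

    (Hypothetical helper; the real tool indexes the vault with embeddings.)
    """
    return {p.stem: p.read_text(encoding="utf-8")
            for p in Path(vault_dir).rglob("*.md")}

def retrieve(question, notes, k=3):
    """Rank notes by term overlap with the question -- a toy stand-in
    for the embedding-based retrieval the real system performs."""
    terms = set(question.lower().split())
    scored = sorted(
        notes.items(),
        key=lambda item: len(terms & set(item[1].lower().split())),
        reverse=True,
    )
    # Return the top-k note names as [[wikilinks]], mirroring how the tool
    # cites the original notes as sources alongside its answer.
    return [f"[[{name}]]" for name, _ in scored[:k]]
```

In the real pipeline, the retrieved note text would then be passed to a local LLM (via Ollama) to synthesize the final answer; here only the retrieval-and-cite step is shown.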

About Local-RAG-with-Ollama

ThomasJanssen-tech/Local-RAG-with-Ollama

Build a 100% local Retrieval Augmented Generation (RAG) system with Python, LangChain, Ollama and ChromaDB!

This project helps Python developers build a custom chatbot that can answer questions based on their own documents. You feed it your documents, and it creates a question-answering system that runs entirely on your local machine. This is for developers who need to create specialized AI assistants without sending their data to external services.

local-AI chatbot-development data-privacy offline-AI custom-knowledge-base
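The document-to-chatbot flow described above follows the standard RAG shape: split documents into chunks, embed them into a vector store, and retrieve the closest chunks for each question. The sketch below substitutes a bag-of-words "embedding" and an in-memory store for the Ollama embeddings and ChromaDB the project actually uses; all names here (`ToyStore`, `chunk`, `embed`) are illustrative, not the project's API.

```python
import math
from collections import Counter

def chunk(text, size=50):
    """Split a document into fixed-size word chunks, as RAG pipelines do
    before embedding (the real project uses LangChain text splitters)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text):
    """Toy bag-of-words vector -- a stand-in for an Ollama embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyStore:
    """Minimal in-memory vector store standing in for ChromaDB."""
    def __init__(self):
        self.items = []

    def add(self, texts):
        self.items += [(t, embed(t)) for t in texts]

    def query(self, question, k=2):
        q = embed(question)
        ranked = sorted(self.items, key=lambda it: cosine(q, it[1]), reverse=True)
        return [t for t, _ in ranked[:k]]
```

Usage mirrors the project's flow: `store.add(chunk(document))` at index time, then `context = store.query(question)` at answer time; the retrieved context would then be inserted into a prompt sent to a local model via Ollama, keeping everything on the local machine.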

Scores updated daily from GitHub, PyPI, and npm data.