LLocalSearch and local-genAI-search
Both tools are competitors: each is a locally running, LLM-powered generative search engine designed to answer user questions without cloud APIs. They differ in scope: LLocalSearch aggregates live web search results, while local-genAI-search (built on Llama 3) answers questions from your local files.
About LLocalSearch
nilsherzig/LLocalSearch
LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
LLocalSearch helps you get direct answers to your questions by searching the internet without relying on big tech companies. You input a question, and it provides a researched answer along with links and logs showing how it found the information. This tool is for anyone who needs to quickly find information online but wants to maintain privacy and avoid potentially manipulated search results.
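The flow described above (question in, researched answer plus source links and logs out) can be sketched in a few lines. This is purely illustrative: LLocalSearch's real pipeline chains LLM agents over a live web search backend, whereas here both the search step and the synthesis step are stubbed with plain Python so the control flow is visible end to end.

```python
# Illustrative sketch only: the search backend and the LLM "agents" are
# stubbed so the question -> search -> answer-with-sources flow is clear.

def search_web(query):
    # Stub standing in for a real local search backend.
    corpus = {
        "https://example.org/llama": "Llama 3 is an open-weight LLM family.",
        "https://example.org/search": "Search aggregators merge results from many engines.",
    }
    return [(url, text) for url, text in corpus.items()
            if any(word in text.lower() for word in query.lower().split())]

def answer_question(question):
    # Step 1: an "agent" would refine the question into a search query
    # (stubbed here as the identity transformation).
    query = question
    # Step 2: gather results, keeping the source URLs for transparency.
    results = search_web(query)
    # Step 3: an "agent" would synthesize an answer from the snippets;
    # here we simply concatenate them.
    summary = " ".join(text for _, text in results) or "No results found."
    return {"answer": summary, "sources": [url for url, _ in results]}
```

The key property mirrored from the tool is that every answer carries its sources, so the user can verify how the information was found.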
About local-genAI-search
nikolamilosevic86/local-genAI-search
Local-GenAI-Search is a generative search engine based on Llama 3, LangChain, and Qdrant that answers questions based on your local files.
This tool helps you quickly find answers and information within your own collection of local documents like PDFs, Word files, and PowerPoints. You feed it a folder of your documents, and you get a conversational search engine that can answer your questions and point you to the exact files for reference. It's ideal for anyone who needs to extract specific information from a large personal or team document archive.
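The retrieval half of that workflow (point the tool at a folder, ask a question, get back the relevant files) can be sketched as below. This is a toy approximation under stated assumptions: the real project embeds documents with LangChain, stores vectors in Qdrant, and has Llama 3 generate the answer, whereas this sketch uses plain keyword-overlap scoring over `.txt` files so it stays self-contained.

```python
import os
import re
from collections import Counter

# Toy stand-in for vector retrieval: score each local .txt file by
# keyword overlap with the question and return the best matches,
# which the tool would then cite as references.

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def index_folder(folder):
    # Map each .txt file in the folder to its token counts.
    index = {}
    for name in os.listdir(folder):
        if name.endswith(".txt"):
            with open(os.path.join(folder, name), encoding="utf-8") as f:
                index[name] = Counter(tokenize(f.read()))
    return index

def search(index, question, top_k=1):
    # Score each file by how often the question's tokens occur in it,
    # then return the top-scoring filenames (skipping zero-score files).
    q_tokens = tokenize(question)
    scored = [(sum(counts[t] for t in q_tokens), name)
              for name, counts in index.items()]
    scored.sort(reverse=True)
    return [name for score, name in scored[:top_k] if score > 0]
```

In the real tool the returned passages would be handed to Llama 3 to compose a conversational answer; the filenames here correspond to the "exact files for reference" the tool points you to.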