LexiestLeszek/web-search-ollama-qwen-local
Local LLM web search using a Qwen model and Ollama
This tool performs web searches and produces summarized answers using a language model that runs locally, so your questions are never sent to an external LLM service. You provide a question or topic, and it fetches relevant web content and returns a concise, direct answer. It is ideal for anyone who needs to find information from the web quickly while keeping data under their own control, such as researchers, security-conscious individuals, or small businesses.
No commits in the last 6 months.
Use this if you need to perform web searches and generate summarized answers using a large language model that runs entirely on your own computer, ensuring your data remains private.
Not ideal if you need the absolute latest real-time information, require complex, multi-turn conversational AI interactions, or don't have the local computing resources to run a large language model.
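The workflow the description outlines (fetch web content, then summarize it with a local Qwen model through Ollama) can be sketched roughly as below. This is a minimal sketch, not the repository's actual code: the model tag, prompt wording, and helper names are illustrative assumptions, and Ollama's default HTTP endpoint is assumed to be running on `localhost:11434`.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint
MODEL = "qwen:1.8b"  # assumed model tag; use whichever Qwen build you have pulled


def build_prompt(question: str, snippets: list[str]) -> str:
    """Combine fetched web snippets into a single summarization prompt."""
    context = "\n\n".join(f"- {s.strip()}" for s in snippets)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )


def ask_local_llm(question: str, snippets: list[str]) -> str:
    """Send the prompt to the local Ollama server and return its answer."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": build_prompt(question, snippets),
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because everything runs against a local server, no query text leaves the machine once the web pages themselves have been fetched.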
Stars
15
Forks
5
Language
Python
License
—
Category
Last pushed
Feb 09, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/LexiestLeszek/web-search-ollama-qwen-local"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
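The same endpoint can be called from Python rather than curl. A small sketch, assuming the response body is JSON (the response schema is not documented on this page, so no field names are shown):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"


def quality_url(owner_repo: str) -> str:
    """Build the per-repository API URL from an 'owner/name' slug."""
    return f"{API_BASE}/{owner_repo}"


def fetch_quality(owner_repo: str) -> dict:
    """Fetch the repository's quality data and parse it as JSON."""
    with urllib.request.urlopen(quality_url(owner_repo)) as resp:
        return json.loads(resp.read())
```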
Higher-rated alternatives
Mintplex-Labs/anythingllm-docs
Documentation of AnythingLLM by Mintplex Labs Inc.
undreamai/LLMUnity
Create characters in Unity with LLMs!
bloodworks-io/phlox
Open source, local first AI medical scribe for desktop and web.
snexus/llm-search
Querying local documents, powered by LLM
mamei16/LLM_Web_search
An extension for oobabooga/text-generation-webui that enables the LLM to search the web