LexiestLeszek/web-search-ollama-qwen-local

Local LLM Web search using qwen model and Ollama

Score: 29 / 100 (Experimental)

This tool helps you perform web searches and get summarized answers without sending your queries or information to external services. You provide a question or topic, and it fetches relevant web content to give you a concise, direct answer. This is ideal for anyone who needs to quickly find information from the web while maintaining strict privacy and data control, such as researchers, security-conscious individuals, or small businesses.

No commits in the last 6 months.

Use this if you need to perform web searches and generate summarized answers using a large language model that runs entirely on your own computer, ensuring your data remains private.

Not ideal if you need the absolute latest real-time information, require complex, multi-turn conversational AI interactions, or don't have the local computing resources to run a large language model.
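The fetch-then-summarize pattern described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the function names, the prompt format, and the use of the `ollama` Python client with a `qwen` model tag are all assumptions.

```python
# Hypothetical sketch of the local web-search-and-summarize flow.
# Assumes the `ollama` Python client and a locally running Ollama
# server with a qwen model pulled; names do not come from the repo.

def build_prompt(query: str, snippets: list[str]) -> str:
    """Combine the user's question with fetched web content into one prompt."""
    context = "\n\n".join(snippets)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer concisely:"
    )

def summarize_locally(query: str, snippets: list[str]) -> str:
    """Send the combined prompt to a local qwen model via Ollama."""
    import ollama  # local inference only; no data leaves the machine
    response = ollama.chat(
        model="qwen",  # assumed model tag; adjust to what you have pulled
        messages=[{"role": "user", "content": build_prompt(query, snippets)}],
    )
    return response["message"]["content"]
```

Because the model runs on your own machine, the query and the fetched page text never reach a third-party LLM service, which is the privacy property the description emphasizes.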

Tags: private-search, local-information-retrieval, data-privacy, offline-knowledge-base, secure-research
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 15 / 25

How are scores calculated?

Stars: 15
Forks: 5
Language: Python
License: none
Last pushed: Feb 09, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/LexiestLeszek/web-search-ollama-qwen-local"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
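The same endpoint can be queried from Python using only the standard library. This is a sketch mirroring the curl call above; the helper names are illustrative, and the shape of the JSON response is not documented here, so it is returned as a plain dict rather than parsed into assumed fields.

```python
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report for a repository."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("LexiestLeszek", "web-search-ollama-qwen-local")` performs the same request as the curl command shown above.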