huytd/txtask
Ask Ollama about your local text files
This tool lets you hold a conversation with your own local text documents: point it at a folder of text files and ask questions about their content. It's designed for anyone who wants to extract information or insights quickly from a personal document collection without reading through everything manually.
No commits in the last 6 months.
Use this if you need to quickly find answers or get summaries from a collection of your own text documents like notes, articles, or reports.
Not ideal if you're working with very large document archives that require persistent, scalable search, or if you need to query live web content.
Stars: 38
Forks: 5
Language: Rust
License: BSD-3-Clause
Category:
Last pushed: Dec 30, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/huytd/txtask"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
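The endpoint path follows the pattern `/api/v1/quality/<category>/<owner>/<repo>`. As a minimal sketch, a hypothetical helper (`quality_url` is not part of any published client; the path segments are taken from the curl command above, and the response schema is not documented here):

```python
# Base of the pt-edge quality API, as shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a listed repo.

    The three arguments mirror the path segments in the curl example.
    """
    return f"{BASE}/{category}/{owner}/{repo}"

# The URL for this page's repo matches the curl command above:
print(quality_url("llm-tools", "huytd", "txtask"))
# → https://pt-edge.onrender.com/api/v1/quality/llm-tools/huytd/txtask
```

The resulting URL can then be fetched with curl as shown above, or with `urllib.request.urlopen` from the Python standard library.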
Higher-rated alternatives
dtnewman/zev: A simple CLI tool to help you remember terminal commands
hecrj/icebreaker: A local AI chat app powered by 🦀 Rust, 🧊 iced, 🤗 Hugging Face, and 🦙 llama.cpp
adammpkins/llama-terminal-completion: AI terminal assistant for any OpenAI-compatible API. Features interactive chat TUI, command...
milosgajdos/bot-banter: Go vs Rust AI bot voice conversation
dustinblackman/oatmeal: Terminal UI to chat with large language models (LLM) using different model backends, and...