iagooar/qqqa
Fast, stateless LLM for your shell: qq answers; qa runs commands
This tool gives command-line users quick answers and lightweight task automation through an AI assistant: type a question and get an answer, or describe a task and let the assistant run the command. It's aimed at developers, system administrators, and anyone who works primarily in the terminal.
Use this if you want fast, on-demand AI assistance directly within your terminal for questions or to automate simple command-line tasks.
Not ideal if you need a long, interactive conversation with memory, or an autonomous multi-step agent that acts without explicit confirmation for each action.
Stars: 609
Forks: 18
Language: Rust
License: MIT
Category:
Last pushed: Nov 30, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/iagooar/qqqa"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
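For scripted access, here is a minimal Python sketch, assuming the URL pattern generalizes from the curl example above (the response schema is not documented here, so the sketch only builds the URL and decodes whatever JSON the endpoint returns):

```python
import json
from urllib.request import Request, urlopen

# Base path taken from the curl example; other collections than
# "llm-tools" are an assumption and may not exist.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL (pattern inferred from the example)."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (fields are undocumented here)."""
    req = Request(quality_url(owner, repo), headers={"Accept": "application/json"})
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("iagooar", "qqqa"))
```

Keep in mind the 100 requests/day unauthenticated limit when polling; how a key is attached (header vs. query parameter) is not specified here.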
Higher-rated alternatives
rtk-ai/rtk
CLI proxy that reduces LLM token consumption by 60-90% on common dev commands. Single Rust...
jnsahaj/lumen
Beautiful git diff viewer, generate commits with AI, get summary of changes, all from the CLI
jkawamoto/ctranslate2-rs
Rust bindings for OpenNMT/CTranslate2
Reim-developer/Sephera
Fast Rust CLI for codebase metrics and deterministic LLM context packs
Topos-Labs/infiniloom
High-performance repository context generator for LLMs - Transform codebases into optimized...