rtk-ai/rtk
CLI proxy that reduces LLM token consumption by 60-90% on common dev commands. A single Rust binary with zero dependencies.
This tool helps software developers cut the cost and improve the efficiency of AI coding assistants such as Claude Code or GitHub Copilot. It works by filtering and compressing the output of common command-line tools (like `git status` or `ls`) before that output reaches the assistant. Developers keep the relevant context from their command output while significantly lowering the number of tokens the assistant consumes, which means faster interactions and lower costs.
6,644 stars. Actively maintained with 218 commits in the last 30 days.
Use this if you are a software developer frequently interacting with AI coding assistants and want to reduce your token usage and get more concise, relevant command output from your terminal.
Not ideal if you rarely use AI coding assistants or prefer to see the full, unfiltered output of every command for debugging purposes.
Stars: 6,644
Forks: 367
Language: Rust
License: MIT
Category:
Last pushed: Mar 13, 2026
Commits (30d): 218
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/rtk-ai/rtk"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
jnsahaj/lumen
Beautiful git diff viewer: generate commits with AI and get a summary of changes, all from the CLI
jkawamoto/ctranslate2-rs
Rust bindings for OpenNMT/CTranslate2
Reim-developer/Sephera
Fast Rust CLI for codebase metrics and deterministic LLM context packs
Topos-Labs/infiniloom
High-performance repository context generator for LLMs - Transform codebases into optimized...
mohsen1/yek
A fast Rust based tool to serialize text-based files in a repository or directory for LLM consumption