brianwestphal/glassbox
A local tool for humans to review AI-generated code
This tool helps software developers efficiently review code generated by AI. It takes the AI's proposed code changes and displays them in a clear, interactive format. Developers can annotate specific lines with structured feedback categories like 'Bug' or 'Style', and that feedback is exported in a format AI tools can read and act on. The output guides the AI to correct and improve its code, creating a tight feedback loop.
Available on npm.
Use this if you are a software developer frequently using AI coding tools and need a dedicated way to review their output, provide structured feedback, and iterate quickly without manual rewrites.
Not ideal if you primarily review code written by human teammates or if your AI coding workflow doesn't involve iterating on structured feedback.
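The description mentions exporting line-level annotations in a machine-readable format. As a purely hypothetical sketch of what such structured feedback might look like (glassbox defines its own actual export format; the type names, fields, and category list below are illustrative assumptions, not the tool's real schema):

```typescript
// Hypothetical sketch only: glassbox defines its own export format.
// The fields and category names here are assumptions for illustration.
type FeedbackCategory = "Bug" | "Style"; // categories named in the description

interface LineAnnotation {
  file: string;             // path of the reviewed file
  line: number;             // 1-indexed line the comment targets
  category: FeedbackCategory;
  comment: string;          // free-form note for the AI to act on
}

// Serialize annotations so an AI coding tool can read them back.
function exportFeedback(annotations: LineAnnotation[]): string {
  return JSON.stringify({ annotations }, null, 2);
}

const review: LineAnnotation[] = [
  { file: "src/auth.ts", line: 42, category: "Bug", comment: "Token expiry is never checked." },
];
console.log(exportFeedback(review));
```

The point of a structured export like this is that the AI receives exact file/line targets and a category, rather than a prose review it has to re-parse.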
Stars
23
Forks
1
Language
TypeScript
License
MIT
Category
Last pushed
Mar 17, 2026
Commits (30d)
0
Dependencies
4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/brianwestphal/glassbox"
Open to everyone: 100 requests/day, no key needed. A free key raises the limit to 1,000/day.
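The same endpoint can also be called from code. A minimal TypeScript sketch, assuming only the URL pattern shown in the curl example above (the response schema is not documented here, so it is treated as opaque JSON):

```typescript
// Build the quality-API URL for any owner/repo pair, following the
// path pattern from the curl example above.
function qualityUrl(owner: string, repo: string): string {
  return `https://pt-edge.onrender.com/api/v1/quality/ai-coding/${owner}/${repo}`;
}

// Example call (requires network; counts against the 100 requests/day limit):
// const res = await fetch(qualityUrl("brianwestphal", "glassbox"));
// const data: unknown = await res.json();
```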
Higher-rated alternatives
Aider-AI/aider
aider is AI pair programming in your terminal
robinebers/openusage
Burning through your subscriptions too fast? Paying for stuff you never use? Stop guessing....
BA-CalderonMorales/terminal-jarvis
Amid all the tools out there that you could possibly use to keep track of them. Here's...
tomlin7/biscuit
biscuit is a fast, extensible, native code editor with agents. lightweight <20 mb in size....
dliedke/ClaudeCodeExtension
A Visual Studio .NET extension that provides a better interface for Claude Code CLI, OpenAI...