YerbaPage/MGDebugger

Multi-Granularity LLM Debugger [ICSE2026]

Overall score: 39 / 100 (Emerging)

This tool helps developers debug code generated by Large Language Models (LLMs) by systematically identifying and fixing errors. You provide the LLM-generated code, and it pinpoints issues at multiple granularity levels, from individual subfunctions up to the whole program. It is designed for software developers who integrate LLMs into their code-generation workflows.
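The multi-granularity idea described above can be sketched as a bottom-up traversal of a subfunction hierarchy: low-level subfunctions are checked and repaired before the functions that call them, and the whole program last. This is an illustrative sketch only; the names (`Node`, `debug_bottom_up`) are hypothetical and not the tool's real API, and the real tool would run tests and query an LLM at each step.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One unit of code in a hypothetical subfunction hierarchy."""
    name: str
    children: list["Node"] = field(default_factory=list)

def debug_bottom_up(node: Node, order: list[str]) -> None:
    """Post-order traversal: subfunctions are debugged before their callers."""
    for child in node.children:
        debug_bottom_up(child, order)
    # In the real tool, this is where unit tests would run and an LLM
    # would be asked to repair this unit before its caller is examined.
    order.append(node.name)

program = Node("main", [Node("parse", [Node("tokenize")]), Node("solve")])
visited: list[str] = []
debug_bottom_up(program, visited)
print(visited)  # leaves first, whole program last
```

Debugging leaves first means each parent is repaired against dependencies that already pass, which is the core benefit the multi-granularity approach claims over whole-program repair.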

No commits in the last 6 months.

Use this if you are a software developer frequently working with LLMs for code generation and need a reliable way to automatically debug and improve the correctness of the generated code.

Not ideal if you are debugging human-written code or do not use Large Language Models for code generation.

Tags: LLM-generated code · software debugging · code generation · developer tools · AI application development
Badges: Stale (6 months) · No Package · No Dependents
Maintenance 2 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 12 / 25

Stars: 96
Forks: 10
Language: Python
License: MIT
Last pushed: Jul 06, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/YerbaPage/MGDebugger"

Open to everyone: 100 requests/day with no API key. Get a free key for 1,000 requests/day.
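The same report can be fetched programmatically. A minimal sketch using only the Python standard library, assuming the endpoint path from the curl example above; the JSON schema of the response is not documented here, so no fields are assumed.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def report_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL from the pattern in the curl example."""
    return f"{BASE}/{owner}/{repo}"

def fetch_report(owner: str, repo: str) -> dict:
    """GET the quality report; needs network access (100 requests/day without a key)."""
    with urllib.request.urlopen(report_url(owner, repo)) as resp:
        return json.load(resp)

print(report_url("YerbaPage", "MGDebugger"))
# fetch_report("YerbaPage", "MGDebugger") would return the parsed JSON report.
```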