ThorneShadowbane/ai-code-guard
Detect security vulnerabilities in AI-generated code
This tool helps software development teams ensure the code generated by AI assistants (like GitHub Copilot or ChatGPT) is secure before it goes into production. It takes your project's codebase as input and identifies potential security flaws, outputting a clear report of vulnerabilities like exposed secrets, injection risks, or AI-specific issues. It's designed for developers, security engineers, or dev leads who integrate AI coding tools into their workflow.
Use this if your team uses AI coding assistants and you need an automated way to detect security vulnerabilities that traditional scanners might miss, especially those unique to AI-generated code.
Not ideal if you're looking for a general-purpose security scanner for human-written code only, or if your team doesn't use AI for code generation at all.
Stars
11
Forks
1
Language
Python
License
MIT
Category
Last pushed
Jan 16, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ThorneShadowbane/ai-code-guard"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
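If you'd rather call the endpoint from Python than shell out to curl, a minimal sketch looks like the following. It uses only the standard library; the response schema and the authorization header format for keyed requests are assumptions, not documented here.

```python
import json
import urllib.request

# Base endpoint shown in the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def build_url(owner: str, repo: str) -> str:
    """Build the quality-data endpoint URL for a given repository."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality_data(owner: str, repo: str, api_key: str = "") -> dict:
    """Fetch a repository's quality data and return the parsed JSON.

    Without a key you get 100 requests/day; a free key raises that to
    1,000/day. The Bearer-token header below is an assumption -- check
    the API docs for the actual authentication scheme.
    """
    request = urllib.request.Request(build_url(owner, repo))
    if api_key:
        request.add_header("Authorization", f"Bearer {api_key}")  # assumed format
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)
```

For example, `fetch_quality_data("ThorneShadowbane", "ai-code-guard")` would request the same URL as the curl command above and return the decoded JSON body for you to inspect.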
Higher-rated alternatives
lintsinghua/DeepAudit
DeepAudit: an AI hacking team for everyone, making vulnerability discovery accessible. The first open-source multi-agent code-vulnerability-mining system from China. One-click deployment for beginners, autonomous collaborative auditing plus automated sandbox PoC verification. Supports Ollama...
usestrix/strix
Open-source AI hackers to find and fix your app’s vulnerabilities.
WuliRuler/AutorizePro
🧿 AutorizePro is a powerful Burp plugin for detecting broken access control (authorization) flaws. By adding AI-assisted analysis and further refining its detection logic, it greatly reduces false positives and improves the efficiency of finding authorization vulnerabilities. [ AutorizePro is...
venslabs/vens
Context-Aware Vulnerability Risk Scoring
Aakashbhardwaj27/ai-scanner
A powerful CLI tool that scans your codebase to detect LLM SDK usage, AI framework integrations,...