microsoft/PyRIT
The Python Risk Identification Tool for generative AI (PyRIT) is an open source framework built to empower security professionals and engineers to proactively identify risks in generative AI systems.
You supply prompts and attack strategies, and the tool helps you uncover potential vulnerabilities such as data leakage or inappropriate content generation. It is designed for security teams and AI engineers responsible for ensuring AI model safety.
3,630 stars. Actively maintained with 94 commits in the last 30 days.
Use this if you need to systematically test your generative AI applications for security vulnerabilities and potential misuse before deployment.
Not ideal if you are looking for a general-purpose AI development framework or a tool for routine AI performance monitoring.
Stars
3,630
Forks
709
Language
Python
License
MIT
Category
Generative AI
Last pushed
Mar 28, 2026
Commits (30d)
94
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/microsoft/PyRIT"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
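The endpoint above can also be queried from a script. A minimal Python sketch follows; the JSON field names used here (`stars`, `forks`, `commits_30d`) are assumptions based on the stats shown on this page, since the response schema is not documented here:

```python
import json
import urllib.request

# API endpoint shown on this page (free tier: 100 requests/day).
API_URL = "https://pt-edge.onrender.com/api/v1/quality/generative-ai/microsoft/PyRIT"

def fetch_quality(url: str = API_URL) -> dict:
    """Fetch the quality record for a repo and parse it as JSON."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def summarize(data: dict) -> str:
    """Build a one-line summary from the stats fields listed on this page.

    The field names ('stars', 'forks', 'commits_30d') are assumptions,
    not a documented schema; missing fields fall back to '?'.
    """
    return (f"{data.get('stars', '?')} stars, "
            f"{data.get('forks', '?')} forks, "
            f"{data.get('commits_30d', '?')} commits in 30d")
```

If the assumed field names match the actual response, `summarize(fetch_quality())` returns a one-line summary string suitable for logging or dashboards.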
Related tools
Azure/PyRIT
The Python Risk Identification Tool for generative AI (PyRIT) is an open source framework built...
arsbr/Veritensor
The Anti-Virus for AI Artifacts & RAG Firewall. A static analysis tool scanning Models and...
canada-ca/navigator
Real-time, collaborative threat modeling tool.
ErdemOzgen/RedAiRange
AI Red Teaming Range
alpernae/AIHTTPAnalyzer
AIHTTPAnalyzer revolutionizes web application security testing by bringing artificial...