microsoft/PyRIT

The Python Risk Identification Tool for generative AI (PyRIT) is an open source framework built to empower security professionals and engineers to proactively identify risks in generative AI systems.

Score: 76/100 (Verified)

PyRIT helps security professionals and engineers proactively find risks in generative AI systems. You supply prompts and attack strategies, and the tool helps uncover potential vulnerabilities such as data leaks or inappropriate content generation. It is designed for security teams and AI engineers responsible for ensuring AI model safety.

3,630 stars. Actively maintained with 94 commits in the last 30 days.

Use this if you need to systematically test your generative AI applications for security vulnerabilities and potential misuse before deployment.

Not ideal if you are looking for a general-purpose AI development framework or a tool for routine AI performance monitoring.

Tags: AI Security, Red Teaming, Generative AI Risk Assessment, AI Safety Testing, AI Governance
No package · No dependents

Score breakdown:
Maintenance: 25/25
Adoption: 10/25
Maturity: 16/25
Community: 25/25


Stars: 3,630
Forks: 709
Language: Python
License: MIT
Last pushed: Mar 28, 2026
Commits (30d): 94

Get this data via API:

curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/microsoft/PyRIT"

Open to everyone (100 requests/day, no key needed); get a free key for 1,000 requests/day.
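The endpoint above can also be called programmatically. A minimal Python sketch: only the URL pattern comes from the `curl` example on this page; the helper names and the assumption that the endpoint returns JSON are mine.

```python
# Sketch: build the quality-API URL for a repo and (optionally) fetch it.
# URL pattern taken from the curl example above; the JSON response shape
# is an assumption, not documented here.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Construct the quality-API URL for a given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Call the public endpoint (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

print(quality_url("generative-ai", "microsoft", "PyRIT"))
# → https://pt-edge.onrender.com/api/v1/quality/generative-ai/microsoft/PyRIT
```

For higher limits, the page suggests requesting a free API key; how the key is passed (header vs. query parameter) is not specified here, so it is omitted from the sketch.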