Awesome-LLM-Red-Teaming and awesome-llm-security
Both projects are curated lists of resources on the general topic of LLM security; the first focuses specifically on red teaming, so they serve overlapping audiences.
About Awesome-LLM-Red-Teaming
user1342/Awesome-LLM-Red-Teaming
A curated list of awesome LLM Red Teaming training, resources, and tools.
This list helps security researchers, AI developers, and auditors identify and exploit vulnerabilities in large language models (LLMs). It collects tools, guides, and research for conducting red-teaming exercises, ranging from practice environments for prompt injection to frameworks for automated adversarial testing, so you can probe weaknesses in LLM security and alignment.
About awesome-llm-security
beyefendi/awesome-llm-security
Awesome LLM security tools, research, and documents