WSU-SEAL/ToxiCR
A supervised learning based tool to identify toxic code review comments
This tool helps software development teams flag potentially harmful or negative language in code review comments: you provide your team's review comments, and it labels the toxic ones. Project managers, team leads, and developers can use it to foster a more constructive team environment.
No commits in the last 6 months.
Use this if you want to automatically detect and flag toxic language within your team's code review discussions to improve communication and team morale.
Not ideal if you need a general-purpose toxicity detector for domains beyond software engineering.
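To illustrate what "supervised learning based" toxicity detection means here, the toy below trains a bag-of-words Naive Bayes classifier on a few hand-labeled comments. This is a minimal sketch of the general technique, not ToxiCR's actual pipeline or interface (neither is shown on this page), and the training comments are invented examples.

```python
# Toy supervised toxicity classifier: bag-of-words multinomial Naive
# Bayes with Laplace smoothing. Illustrative only; NOT ToxiCR's code.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split into alphabetic tokens."""
    return re.findall(r"[a-z]+", text.lower())

class NaiveBayes:
    def fit(self, comments, labels):
        self.word_counts = {0: Counter(), 1: Counter()}
        self.class_counts = Counter(labels)
        for text, y in zip(comments, labels):
            self.word_counts[y].update(tokenize(text))
        self.vocab = set(self.word_counts[0]) | set(self.word_counts[1])
        return self

    def predict(self, text):
        total_docs = sum(self.class_counts.values())
        scores = {}
        for y in (0, 1):  # 0 = non-toxic, 1 = toxic
            total = sum(self.word_counts[y].values())
            score = math.log(self.class_counts[y] / total_docs)
            for tok in tokenize(text):
                # Add-one smoothing over the shared vocabulary
                score += math.log(
                    (self.word_counts[y][tok] + 1) / (total + len(self.vocab))
                )
            scores[y] = score
        return max(scores, key=scores.get)

# Hand-labeled toy training data (invented for this sketch)
train = [
    ("this change looks good, nice refactor", 0),
    ("thanks, clean implementation", 0),
    ("this code is garbage, did you even think", 1),
    ("what an idiotic way to write this", 1),
]
clf = NaiveBayes().fit([t for t, _ in train], [y for _, y in train])
```

A real tool like ToxiCR would train on a much larger labeled corpus of code review comments and use richer features, but the supervised workflow is the same: fit on labeled examples, then predict a toxic/non-toxic label for new comments.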
Stars: 18
Forks: 10
Language: Python
License: GPL-3.0
Category:
Last pushed: Sep 15, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/WSU-SEAL/ToxiCR"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
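The same endpoint can be called from Python. The sketch below builds the URL shown in the curl example above; the response schema is not documented on this page, so the payload is decoded as generic JSON rather than into named fields.

```python
# Sketch of calling the quality API from Python. Host and path are
# taken from the curl example; the JSON schema is not documented here.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def api_url(owner, repo):
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner, repo, timeout=10):
    """Fetch and decode the JSON payload; returns None on network failure."""
    try:
        with urllib.request.urlopen(api_url(owner, repo), timeout=timeout) as resp:
            return json.load(resp)
    except OSError:
        return None
```

For example, `api_url("WSU-SEAL", "ToxiCR")` reproduces the exact URL from the curl command.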
Higher-rated alternatives
unitaryai/detoxify
Trained models & code to predict toxic comments on all 3 Jigsaw Toxic Comment Challenges. Built...
kensk8er/chicksexer
A Python package for gender classification.
Infinitode/ValX
ValX is an open-source Python package for text cleaning tasks, including profanity detection and...
PavelOstyakov/toxic
Toxic Comment Classification Challenge
minerva-ml/open-solution-toxic-comments
Open solution to the Toxic Comment Classification Challenge