WSU-SEAL/ToxiCR

A supervised-learning-based tool to identify toxic code review comments

Score: 41 / 100 (Emerging)

This tool helps software development teams identify potentially harmful or negative language in code review comments. You provide your team's code review comments, and it flags the ones that are toxic. Software project managers, team leads, and developers can use it to foster a more constructive team environment.

No commits in the last 6 months.

Use this if you want to automatically detect and flag toxic language within your team's code review discussions to improve communication and team morale.

Not ideal if you're looking for a general-purpose toxicity detector for broader applications beyond the software engineering domain.

code-review team-management software-development developer-relations communication-analysis
Stale (6m) · No Package · No Dependents
Maintenance 2 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 17 / 25

How are scores calculated?
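The category scores above appear to sum to the overall score (2 + 6 + 16 + 17 = 41), which can be checked with a quick sketch; this additive model is an assumption inferred from the numbers shown, not documented behavior of the scoring site:

```python
# Assumption: the overall score is the sum of the four category scores,
# each out of 25, giving a total out of 100.
category_scores = {
    "Maintenance": 2,
    "Adoption": 6,
    "Maturity": 16,
    "Community": 17,
}

overall = sum(category_scores.values())
print(f"{overall} / 100")  # 41 / 100
```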

Stars: 18
Forks: 10
Language: Python
License: GPL-3.0
Last pushed: Sep 15, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/WSU-SEAL/ToxiCR"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
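For programmatic use, the curl endpoint above can be wrapped in a small Python helper. This is a minimal sketch: the URL structure is taken from the example, but the JSON response fields and the authorization header name for keyed access are assumptions, so check the API documentation before relying on them:

```python
import json
import urllib.request
from typing import Optional

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-report URL, matching the curl example's path layout."""
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"


def fetch_quality(ecosystem: str, owner: str, repo: str,
                  api_key: Optional[str] = None) -> dict:
    """Fetch the quality report as parsed JSON.

    The response schema is not documented here; inspect the returned
    dict to see which fields (score, categories, etc.) are available.
    """
    req = urllib.request.Request(quality_url(ecosystem, owner, repo))
    if api_key:
        # Header name is a guess; the site may expect a different scheme.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Reproduces the curl example's URL for this repository.
    print(quality_url("nlp", "WSU-SEAL", "ToxiCR"))
```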