toxic vs. open-solution-toxic-comments

These are competing implementations of the same Kaggle competition. Both classify toxic comments using similar datasets and evaluation metrics, so users would typically choose one approach rather than use them together.

Metric          toxic                                open-solution-toxic-comments
Score           49 (Emerging)
Maintenance     0/25                                 0/25
Adoption        10/25                                10/25
Maturity        16/25                                16/25
Community       23/25                                22/25
Stars           266                                  156
Forks           73                                   56
Downloads
Commits (30d)   0                                    0
Language        Python                               Python
License         MIT                                  MIT
Flags           Stale 6m, No Package, No Dependents  Archived, Stale 6m, No Package, No Dependents

About toxic

PavelOstyakov/toxic

Toxic Comment Classification Challenge

This tool helps content moderators and online community managers automatically identify and categorize toxic comments. You input raw comment data, and it outputs predictions for different toxicity types. It's designed for anyone needing to efficiently flag harmful language in user-generated content.
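The underlying task is multilabel classification: each comment can carry several toxicity labels at once. As a minimal illustrative sketch only (this is not the repo's actual pipeline; the model choice and tiny training set here are assumptions, though the six labels are the ones used by the Kaggle challenge), a TF-IDF plus one-vs-rest logistic regression baseline looks like this:

```python
# Hypothetical baseline for the Jigsaw Toxic Comment Classification task.
# Not PavelOstyakov/toxic's actual method; a generic scikit-learn sketch.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# The six label columns defined by the Kaggle challenge.
LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

# Toy training data (made up for illustration): one row of 0/1 flags per comment.
train_texts = [
    "have a nice day everyone",
    "I will find you and hurt you",
    "you are a disgusting idiot",
    "you people are vermin, get out",
]
train_labels = np.array([
    [0, 0, 0, 0, 0, 0],
    [1, 0, 0, 1, 0, 0],
    [1, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1],
])

# Turn raw comment text into sparse TF-IDF features.
vectorizer = TfidfVectorizer(ngram_range=(1, 2), min_df=1)
X = vectorizer.fit_transform(train_texts)

# One binary logistic regression per label (multilabel via one-vs-rest).
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X, train_labels)

# Predict a per-label toxicity probability for a new comment.
probs = clf.predict_proba(vectorizer.transform(["you idiot"]))
for label, p in zip(LABELS, probs[0]):
    print(f"{label}: {p:.2f}")
```

In practice both repositories use far stronger models than this baseline; the sketch only shows the input/output shape of the task: raw comment text in, one probability per toxicity type out.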

Tags: content-moderation, online-community-management, brand-safety, user-generated-content, social-media-management

About open-solution-toxic-comments

minerva-ml/open-solution-toxic-comments

Open solution to the Toxic Comment Classification Challenge

Scores are updated daily from GitHub, PyPI, and npm data.