articlequality and editquality

These are ecosystem siblings, both from Wikimedia, representing distinct but related efforts to assess the quality of contributions to its projects: articlequality scores whole articles, while editquality scores individual edits.

                 articlequality      editquality
Score            55 (Established)    52 (Established)
Maintenance      2/25                0/25
Adoption         8/25                7/25
Maturity         25/25               25/25
Community        20/25               20/25
Stars            50                  37
Forks            30                  23
Downloads
Commits (30d)    0                   0
Language         Python              Python
License          MIT                 MIT
Flags            Stale 6m            Stale 6m, No Dependents

About articlequality

wikimedia/articlequality

Github mirror - our actual code is hosted with Gerrit (please see https://www.mediawiki.org/wiki/Developer_access for contributing)

This library helps Wikipedia editors automatically assess the quality of articles. You provide the text content of a Wikipedia page, and it returns a prediction of its quality class (e.g., 'stub', 'b', 'fa') along with the probabilities for each class. It's designed for Wikipedia editors or researchers who need to categorize articles efficiently.

wikipedia-editing content-assessment article-quality encyclopedia-management text-categorization
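To make the input/output contract concrete, here is an illustrative sketch only: this is not the articlequality API, and the length-based heuristic stands in for the real trained model. It mimics the shape described above, where article text goes in and a predicted quality class plus per-class probabilities come out.

```python
# Hypothetical sketch -- NOT the real articlequality library.
# Wikipedia's assessment scale runs roughly stub < start < c < b < ga < fa.
QUALITY_CLASSES = ["stub", "start", "c", "b", "ga", "fa"]

def score_article(text: str) -> dict:
    """Return a predicted quality class and a probability per class."""
    words = len(text.split())
    # Toy heuristic: longer articles lean toward higher classes
    # (the real model uses many structural and content features).
    idx = min(words // 500, len(QUALITY_CLASSES) - 1)
    # Concentrate most of the probability mass on the chosen class.
    probs = {c: 0.04 for c in QUALITY_CLASSES}
    probs[QUALITY_CLASSES[idx]] = 1.0 - 0.04 * (len(QUALITY_CLASSES) - 1)
    return {"prediction": QUALITY_CLASSES[idx], "probability": probs}

result = score_article("A short placeholder article. " * 10)
print(result["prediction"])  # -> 'stub' for this short text
```

The returned dictionary mirrors the description above: a single best-guess class plus the full probability distribution, so callers can apply their own confidence thresholds.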

About editquality

wikimedia/editquality

Github mirror - our actual code is hosted with Gerrit (please see https://www.mediawiki.org/wiki/Developer_access for contributing)

This library helps identify potentially problematic edits on Wikimedia projects like Wikipedia. It takes recent changes or proposed edits as input and uses machine learning models to predict their quality or likelihood of being vandalism. This helps human editors and community managers prioritize reviews and maintain content integrity.

content-moderation community-management vandalism-detection wikimedia-editing edit-review
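The review workflow the description implies can be sketched as follows. This is an illustrative sketch only, not the editquality API: the hand-written heuristic stands in for the real classifier, and all field names are hypothetical. Each incoming edit gets a "damaging" probability, and reviewers triage the queue by that score.

```python
# Hypothetical sketch -- NOT the real editquality library.
def damaging_probability(edit: dict) -> float:
    """Toy stand-in for a trained model's P(damaging) for one edit."""
    score = 0.1
    if edit.get("anonymous"):                # unregistered editors get a bump
        score += 0.3
    if edit.get("chars_removed", 0) > 1000:  # large removals look riskier
        score += 0.4
    return min(score, 1.0)

def triage(edits: list[dict], threshold: float = 0.5) -> list[dict]:
    """Return edits that need human review, riskiest first."""
    scored = [{**e, "p_damaging": damaging_probability(e)} for e in edits]
    flagged = [e for e in scored if e["p_damaging"] >= threshold]
    return sorted(flagged, key=lambda e: e["p_damaging"], reverse=True)

queue = triage([
    {"id": 1, "anonymous": True, "chars_removed": 2000},
    {"id": 2, "anonymous": False, "chars_removed": 5},
    {"id": 3, "anonymous": True, "chars_removed": 10},
])
print([e["id"] for e in queue])  # -> [1]
```

Sorting by predicted probability rather than hard-classifying every edit is what lets human editors prioritize reviews, as the description notes: the worst candidates surface first while low-risk edits skip the queue.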

Scores updated daily from GitHub, PyPI, and npm data.