krishnap25/mauve

Package to compute MAUVE, a similarity score between neural text and human text. Install with `pip install mauve-text`.

Score: 41 / 100 (Emerging)

This tool helps researchers and practitioners evaluate how similar machine-generated text is to human-written text. You provide samples of text created by an AI model and samples of human-authored text. It then calculates a MAUVE score, which measures how closely the distribution of the model's text matches the distribution of human text, helping you gauge the quality and naturalness of the AI's output. This is useful for anyone developing or assessing AI text generation systems, such as those in natural language processing research or content creation.
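At a high level, MAUVE embeds both text samples with a language model, quantizes the embeddings into histograms over clusters, and reports the area under a divergence frontier between the two resulting distributions. Below is a minimal plain-Python sketch of that last step only, under stated assumptions: `p` and `q` stand in for the cluster histograms (the real package derives them from model embeddings), and `c=5` mirrors the scaling constant used in the MAUVE paper. This is an illustrative sketch, not the package's implementation; for actual use, the library exposes a `compute_mauve` function that works directly on raw text.

```python
import math

def kl(p, q):
    """KL divergence KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mauve_score(p, q, c=5.0, n=200):
    """Area under the divergence frontier between histograms p and q.

    For each mixture weight w, form r = w*p + (1-w)*q and record the
    point (exp(-c*KL(q||r)), exp(-c*KL(p||r))). The curve runs from
    (1, 0)-ish to (0, 1)-ish; its area is the MAUVE-style score.
    """
    xs, ys = [1.0], [0.0]  # endpoint convention: close the curve at the axes
    for i in range(1, n):
        w = i / n
        r = [w * pi + (1 - w) * qi for pi, qi in zip(p, q)]
        xs.append(math.exp(-c * kl(q, r)))
        ys.append(math.exp(-c * kl(p, r)))
    xs.append(0.0)
    ys.append(1.0)
    # Trapezoidal area under the curve; x decreases monotonically along it.
    area = 0.0
    for i in range(len(xs) - 1):
        area += abs(xs[i] - xs[i + 1]) * (ys[i] + ys[i + 1]) / 2
    return area
```

Identical distributions score 1.0; disjoint ones approach 0, matching the intuition that a higher MAUVE score means model text that is harder to distinguish from human text.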

308 stars. No commits in the last 6 months.

Use this if you need an objective and quantifiable way to compare the distribution and quality of AI-generated text against human text, to understand if your model's output is realistic and diverse.

Not ideal if you only need a qualitative assessment or subjective human feedback on text quality, or if you are working with very small text samples (fewer than a few thousand generations).

Tags: AI-text-evaluation, Natural-Language-Generation, Content-Quality-Assessment, Machine-Translation-Evaluation, Text-Similarity-Analysis
Flags: Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 308
Forks: 28
Language: Python
License: (not listed)
Last pushed: Jul 12, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/krishnap25/mauve"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.