shenxiangzhuang/bleuscore
BLEU Score in Rust
This tool lets machine translation researchers and practitioners quickly evaluate the quality of translated text. Given a list of machine-generated translations and one or more human reference translations, it outputs a BLEU score indicating how closely the machine output matches the references. It is well suited to large-scale natural language processing projects, especially in machine translation.
Available on PyPI.
Use this if you need a significantly faster way to compute BLEU scores over large volumes of machine translation output, particularly from Python.
Not ideal if you are evaluating only a handful of translations or need a metric other than BLEU.
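BLEU itself combines clipped n-gram precisions with a brevity penalty. The following is a minimal, self-contained Python sketch of that formula for intuition about what the score measures; it is not this library's API, and the function names here are illustrative only:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, references, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions (orders 1..max_n) times a brevity penalty."""
    cand = candidate.split()
    refs = [ref.split() for ref in references]
    log_precision_sum = 0.0
    for n in range(1, max_n + 1):
        cand_counts = ngrams(cand, n)
        # Clip each candidate n-gram count by its maximum count in any reference.
        max_ref_counts = Counter()
        for ref in refs:
            for gram, cnt in ngrams(ref, n).items():
                max_ref_counts[gram] = max(max_ref_counts[gram], cnt)
        clipped = sum(min(cnt, max_ref_counts[gram]) for gram, cnt in cand_counts.items())
        total = sum(cand_counts.values())
        if clipped == 0:
            return 0.0  # no overlap at this order; unsmoothed BLEU is 0
        log_precision_sum += math.log(clipped / total) / max_n
    # Brevity penalty: compare candidate length c to the closest reference length r.
    c = len(cand)
    r = min((abs(len(ref) - c), len(ref)) for ref in refs)[1]
    bp = 1.0 if c > r else math.exp(1 - r / c)
    return bp * math.exp(log_precision_sum)

print(round(bleu("the cat is on the mat", ["the cat is on the mat"]), 4))  # perfect match -> 1.0
```

A pure-Python loop like this is exactly what becomes a bottleneck over large corpora, which is the gap a compiled Rust implementation with Python bindings aims to close.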
Stars
12
Forks
1
Language
Rust
License
MIT
Category
Last pushed
Mar 01, 2026
Monthly downloads
18
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/shenxiangzhuang/bleuscore"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
SomeB1oody/RustyML
A high-performance machine learning library in pure Rust, offering statistical utilities, ML...
smartcorelib/smartcore
A comprehensive library for machine learning and numerical computing. Apply Machine Learning...
open-spaced-repetition/fsrs-rs
FSRS for Rust, including Optimizer and Scheduler
open-spaced-repetition/fsrs-optimizer
FSRS Optimizer Package
rust-ml/linfa
A Rust machine learning framework.