bloomberg/minilmv2.bb

Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)

Score: 33/100 (Emerging)

This project helps machine learning engineers and researchers reduce the size and computational cost of large language models while maintaining their performance. It takes a pre-trained large language model (the "teacher") and the data it was trained on as input, and produces a smaller, more efficient "student" model that is easier to deploy.
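MiniLMv2 (per the paper linked above) distills the teacher's self-attention relations into the student: each model's query (or key/value) vectors are turned into softmax-normalized self-relation matrices, and the student is trained to minimize the KL divergence between the teacher's and the student's relations. A minimal pure-Python sketch of that relation loss; the toy vectors and function names here are illustrative, not taken from this repository:

```python
import math

def softmax(row):
    # Numerically stable softmax over one row of scores.
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [e / total for e in exps]

def relation_matrix(vectors):
    # Scaled dot-product self-relations: softmax(V V^T / sqrt(d)) per row.
    d = len(vectors[0])
    scores = [[sum(a * b for a, b in zip(u, v)) / math.sqrt(d) for v in vectors]
              for u in vectors]
    return [softmax(row) for row in scores]

def relation_kl_loss(teacher_vecs, student_vecs):
    # KL(teacher || student) for each position, averaged over positions.
    t = relation_matrix(teacher_vecs)
    s = relation_matrix(student_vecs)
    kl = [sum(tv * (math.log(tv) - math.log(sv)) for tv, sv in zip(trow, srow))
          for trow, srow in zip(t, s)]
    return sum(kl) / len(kl)

# Toy query vectors for a 3-token sequence (hypothetical values).
teacher_q = [[0.2, 1.0, -0.5], [1.3, -0.7, 0.4], [0.0, 0.5, 0.9]]
student_q = [[0.1, 0.8, -0.2], [1.0, -0.5, 0.6], [0.2, 0.3, 1.1]]

print(relation_kl_loss(teacher_q, teacher_q))  # identical relations -> 0.0
print(relation_kl_loss(teacher_q, student_q))  # positive when relations differ
```

In the actual method the relations are computed per relation head over queries, keys, and values, and the loss is backpropagated through the student; this sketch only shows the objective being minimized.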

No commits in the last 6 months.

Use this if you need to deploy powerful transformer-based language models in resource-constrained environments or accelerate inference times for natural language processing tasks.

Not ideal if you are looking for a pre-trained, ready-to-use small language model without needing to perform a distillation process yourself.

natural-language-processing machine-learning-deployment model-optimization transformer-models
Stale (6m) · No package · No dependents

Maintenance: 0/25
Adoption: 8/25
Maturity: 16/25
Community: 9/25


Stars: 61
Forks: 5
Language: Python
License: Apache-2.0
Last pushed: Jun 12, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/bloomberg/minilmv2.bb"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
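The same endpoint shown in the curl example can be called from Python with the standard library. The URL is taken from the curl line above; the shape of the JSON response is not documented here, so the fetch is left commented out as a sketch:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem, repo):
    # Build the quality-score endpoint URL shown in the curl example.
    return f"{BASE}/{ecosystem}/{repo}"

url = quality_url("transformers", "bloomberg/minilmv2.bb")
print(url)

# Uncomment to fetch (100 requests/day without a key):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     print(data)
```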