Geotrend-research/smaller-transformers

Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0.

Score: 39 / 100 (Emerging)

This project provides smaller, more efficient versions of multilingual transformer models, reducing the computational resources needed for natural language processing tasks. It takes a standard multilingual model and produces a compact version, covering only the languages you need, that performs comparably while being faster and cheaper to run. It is aimed at machine learning engineers, data scientists, and MLOps specialists working with multilingual text who need to deploy models on resource-constrained platforms.
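For example, a minimal sketch of loading one of these reduced models with the Hugging Face transformers library; the Hub identifier "Geotrend/bert-base-en-fr-cased" follows the project's naming scheme but is an assumption, so check the repository for the exact model names:

from transformers import AutoTokenizer, AutoModel  # pip install transformers torch

model_name = "Geotrend/bert-base-en-fr-cased"  # assumed Hub name for an English+French variant
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a French sentence and run a forward pass
inputs = tokenizer("Bonjour, comment ça va ?", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)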

105 stars. No commits in the last 6 months.

Use this if you are deploying multilingual AI models and need to reduce their size and memory footprint without significantly compromising accuracy, especially on cloud platforms or edge devices.
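To check how much smaller a reduced model actually is, one can compare parameter counts against the full multilingual baseline; a sketch under the same naming assumption:

from transformers import AutoModel

full = AutoModel.from_pretrained("bert-base-multilingual-cased")     # standard 104-language mBERT
small = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-cased")  # assumed two-language variant

def n_params(model):
    return sum(p.numel() for p in model.parameters())

print(f"full:  {n_params(full):,} parameters")
print(f"small: {n_params(small):,} parameters")

In multilingual BERT-style models the vocabulary embedding matrix dominates the parameter count, so trimming the vocabulary to the target languages accounts for most of the savings.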

Not ideal if your application requires support for a very large number of languages (e.g., over 100 languages) from a single model instance.

Tags: natural-language-processing, machine-translation, text-analysis, model-deployment, resource-optimization
Flags: Stale (6 months), No Package, No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 14 / 25

How are scores calculated?
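The four 25-point subscores above sum to the overall score: 0 + 9 + 16 + 14 = 39 out of 100.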

Stars: 105
Forks: 13
Language: Jupyter Notebook
License: Apache-2.0
Last pushed: May 20, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Geotrend-research/smaller-transformers"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
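The same data can be fetched programmatically; a minimal sketch using Python's requests library, assuming the endpoint returns JSON:

import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/Geotrend-research/smaller-transformers")
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # fail loudly on HTTP errors or rate limiting
print(resp.json())       # assumed JSON payload; inspect the returned fields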