Geotrend-research/smaller-transformers
Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0.
This project provides smaller, more efficient versions of multilingual transformer models, reducing the computational resources needed for natural language processing tasks. Given a standard multilingual model, it produces a compact version covering only the languages you need, one that still performs well but is faster and cheaper to run. It is aimed at machine learning engineers, data scientists, and MLOps specialists who work with multilingual text and need to deploy models on resource-constrained platforms.
105 stars. No commits in the last 6 months.
Use this if you are deploying multilingual AI models and need to reduce their size and memory footprint without significantly compromising accuracy, especially on cloud platforms or edge devices.
Not ideal if your application requires a single model instance to support a very large number of languages (e.g., over 100).
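For context, here is a minimal sketch of how such a reduced checkpoint is typically loaded with the Hugging Face transformers library. The model ID Geotrend/bert-base-en-fr-cased (an English+French reduction of multilingual BERT) is an assumption about the project's published checkpoints, not something documented on this page.

from transformers import AutoModel, AutoTokenizer

# Assumed Hub ID for one of the reduced checkpoints (English + French);
# substitute the language combination your application needs.
model_id = "Geotrend/bert-base-en-fr-cased"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# The smaller model is used exactly like the full multilingual original.
inputs = tokenizer("Bonjour, comment allez-vous ?", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)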
Stars: 105
Forks: 13
Language: Jupyter Notebook
License: Apache-2.0
Category: (not listed)
Last pushed: May 20, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Geotrend-research/smaller-transformers"
Open to everyone: 100 requests/day with no API key. Get a free key for 1,000 requests/day.
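If you prefer to call the endpoint from code, here is a minimal sketch using only the Python standard library. The URL is copied from the curl command above; the response schema is not documented on this page, so the example simply pretty-prints whatever JSON comes back.

import json
import urllib.request

# Endpoint copied from the curl example above.
url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/Geotrend-research/smaller-transformers")

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

# Schema undocumented here, so just pretty-print the payload.
print(json.dumps(data, indent=2))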
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action