huggingface/optimum

🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy-to-use hardware optimization tools

Score: 77 / 100 · Verified

This tool helps machine learning engineers and researchers accelerate their language, image, and sentence models. It takes existing models built with popular frameworks like Hugging Face Transformers or Diffusers and optimizes them for faster execution and training on specialized hardware. The output is a more efficient model that runs faster and uses fewer resources.

3,325 stars. Used by 29 other packages. Actively maintained with 1 commit in the last 30 days. Available on PyPI.

Use this if you need to make your AI models run faster and more efficiently on specific hardware, whether for training or deploying them for real-world use.

Not ideal if you are a general user looking for a pre-built AI application, as this tool is for optimizing existing models rather than creating new ones.

Tags: AI model optimization · machine learning · deployment · deep learning · training · natural language processing · computer vision
Maintenance 13 / 25
Adoption 15 / 25
Maturity 25 / 25
Community 24 / 25


Stars: 3,325
Forks: 624
Language: Python
License: Apache-2.0
Last pushed: Mar 12, 2026
Commits (30d): 1
Dependencies: 5
Reverse dependents: 29

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/huggingface/optimum"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
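The endpoint shown in the curl example can also be called from Python with only the standard library. The path layout (`/{ecosystem}/{owner}/{repo}`) is inferred from that single example and is an assumption:

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a package.

    The /{ecosystem}/{owner}/{repo} path layout is inferred from the
    documented curl example and may not cover every case.
    """
    return f"{BASE}/{quote(ecosystem)}/{quote(owner)}/{quote(repo)}"

url = quality_url("transformers", "huggingface", "optimum")
print(url)
# Fetch with e.g. urllib.request.urlopen(url).read()
```

Percent-encoding the path segments keeps the builder safe for names containing unusual characters, although typical GitHub owner/repo names will pass through unchanged.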