optimum and optimum-intel

Optimum Intel is a specialized backend/extension within the broader Optimum ecosystem that provides Intel-specific optimization implementations (such as OpenVINO and Intel Neural Compressor support) for the general-purpose Optimum library, making the two complements designed to be used together.

Score comparison (each category scored out of 25):

| Category    | optimum       | optimum-intel    |
|-------------|---------------|------------------|
| Overall     | 77 (Verified) | 68 (Established) |
| Maintenance | 13/25         | 17/25            |
| Adoption    | 15/25         | 10/25            |
| Maturity    | 25/25         | 16/25            |
| Community   | 24/25         | 25/25            |

Repository statistics:

| Stat          | optimum    | optimum-intel    |
|---------------|------------|------------------|
| Stars         | 3,325      | 548              |
| Forks         | 624        | 205              |
| Commits (30d) | 1          | 14               |
| Language      | Python     | Jupyter Notebook |
| License       | Apache-2.0 | Apache-2.0       |

No risk flags.

About optimum

huggingface/optimum

🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy to use hardware optimization tools

This tool helps machine learning engineers and researchers accelerate their language, image, and sentence models. It takes existing models built with popular frameworks like Hugging Face Transformers or Diffusers and optimizes them for faster inference and training on specialized hardware. The output is a more efficient model that runs quicker and uses fewer resources.

Tags: AI model optimization, machine learning deployment, deep learning training, natural language processing, computer vision

About optimum-intel

huggingface/optimum-intel

🤗 Optimum Intel: Accelerate inference with Intel optimization tools

This is a tool for developers working with AI models on Intel hardware. It helps take large language models (LLMs) or other deep learning models from libraries like Transformers or Diffusers, optimize them using Intel's OpenVINO toolkit, and prepare them for faster deployment. Developers use it to make their AI applications run more efficiently on Intel CPUs, GPUs, and other accelerators.

Tags: AI-model-deployment, deep-learning-optimization, machine-learning-engineering, model-quantization, edge-AI

Scores updated daily from GitHub, PyPI, and npm data.