optimum and optimum-rbln

optimum-rbln is a hardware-specific backend plugin for Optimum that enables optimized inference on RBLN NPUs, so the two are complementary tools used together rather than alternatives.

                  optimum          optimum-rbln
Score             77 (Verified)    55 (Established)
Maintenance       13/25            10/25
Adoption          15/25            6/25
Maturity          25/25            25/25
Community         24/25            14/25
Stars             3,325            15
Forks             624              3
Downloads         —                —
Commits (30d)     1                0
Language          Python           Python
License           Apache-2.0       Apache-2.0
Risk flags        None             None

About optimum

huggingface/optimum

🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy to use hardware optimization tools

This tool helps machine learning engineers and researchers accelerate their large language, image, and sentence models. It takes existing AI models built with popular frameworks like Hugging Face Transformers or Diffusers and optimizes them for faster inference and training on specialized hardware. The output is a more efficient model that runs quicker and uses fewer resources.

Tags: AI model optimization, machine learning deployment, deep learning training, natural language processing, computer vision

About optimum-rbln

RBLN-SW/optimum-rbln

⚡ A seamless integration of HuggingFace Transformers & Diffusers with RBLN SDK for efficient inference on RBLN NPUs.

This is a tool for developers working with large language models and image generation models. It allows you to run existing Hugging Face Transformers and Diffusers models on RBLN Neural Processing Units (NPUs) for faster inference. You input your existing model code and get more performant execution on specialized hardware, enabling quicker AI application deployment.

Tags: AI inference optimization, machine learning engineering, model deployment, natural language processing, computer vision

Scores updated daily from GitHub, PyPI, and npm data.