huggingface/optimum-graphcore
Blazing fast training of 🤗 Transformers on Graphcore IPUs
This project helps machine learning engineers and researchers accelerate the training and fine-tuning of large language models and other AI models. It provides tools to run popular Hugging Face Transformers models efficiently on Graphcore Intelligence Processing Units (IPUs), which are specialized AI processors. You bring your existing Transformers models and datasets, and it produces trained or fine-tuned models ready for deployment, in less time.
No commits in the last 6 months.
Use this if you are a machine learning engineer working with Hugging Face Transformer models and want to significantly speed up your training and inference workflows using Graphcore IPU hardware.
Not ideal if you are not using Graphcore IPUs, or if your models are not based on the Hugging Face Transformers library.
Stars
87
Forks
33
Language
Python
License
Apache-2.0
Last pushed
Mar 11, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/huggingface/optimum-graphcore"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
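The same request can be made from Python. A minimal stdlib sketch of the curl line above; note the `X-API-Key` header name and the shape of the JSON response are assumptions, since the page does not document them:

```python
import json
from typing import Optional
from urllib.request import Request, urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Mirrors the curl example: /quality/<category>/<owner>/<repo>
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str,
                  api_key: Optional[str] = None) -> dict:
    """Fetch the quality record for a repo and parse it as JSON."""
    req = Request(quality_url(category, owner, repo))
    if api_key:
        # Hypothetical header name; check the API docs for how keys are sent.
        req.add_header("X-API-Key", api_key)
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

Without a key you are limited to 100 requests/day, so cache responses where possible.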
Higher-rated alternatives
openvinotoolkit/nncf
Neural Network Compression Framework for enhanced OpenVINO™ inference
huggingface/optimum
🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers...
NVIDIA/Megatron-LM
Ongoing research training transformer models at scale
huggingface/optimum-intel
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
eole-nlp/eole
Open language modeling toolkit based on PyTorch