huggingface/optimum-graphcore

Blazing fast training of 🤗 Transformers on Graphcore IPUs

Quality score: 46 / 100 (Emerging)

This project helps machine learning engineers and researchers accelerate the training and fine-tuning of large language models and other AI models. It provides tools to run popular Hugging Face Transformers models efficiently on Graphcore Intelligence Processing Units (IPUs), specialized AI processors. You supply your existing Transformer models and datasets; it outputs faster-trained or fine-tuned models ready for deployment.

No commits in the last 6 months.

Use this if you are a machine learning engineer working with Hugging Face Transformer models and want to significantly speed up your training and inference workflows using Graphcore IPU hardware.

Not ideal if you are not using Graphcore IPUs, or if your models are not based on the Hugging Face Transformers library.

Tags: deep-learning, natural-language-processing, computer-vision, model-training, AI-acceleration
Badges: Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 21 / 25


Stars: 87
Forks: 33
Language: Python
License: Apache-2.0
Last pushed: Mar 11, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/huggingface/optimum-graphcore"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
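The same endpoint can also be called from code. A minimal Python sketch, assuming the URL pattern shown in the curl example above generalizes as `/api/v1/quality/<ecosystem>/<owner>/<repo>` (an assumption based on that single example; the real API may differ):

```python
# Build the quality-API URL for a given repository.
# NOTE: the path pattern is an assumption generalized from the one
# curl example shown above.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Return the quality endpoint URL for a single repository."""
    return f"{BASE_URL}/{ecosystem}/{owner}/{repo}"

# Reproduces the URL from the curl example above.
print(quality_url("transformers", "huggingface", "optimum-graphcore"))
```

The returned URL can then be fetched with any HTTP client (curl, `urllib.request`, etc.), subject to the rate limits noted above.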