graphcore-research/tessellate-ipu
TessellateIPU: low level Poplar tile programming from Python
This library gives machine learning researchers and developers, particularly those working with JAX, fine-grained control over how their algorithms run on Graphcore IPU hardware. It lets you specify exactly how data arrays are distributed and processed across the IPU's tiles: you provide your JAX code and data, and the library executes them with placement tailored to the IPU's architecture, enabling more efficient and customized algorithm implementations.
No commits in the last 6 months.
Use this if you are a machine learning researcher or developer using JAX and need to optimize your algorithms by manually controlling data placement and operations on Graphcore IPU hardware tiles.
Not ideal if you are a practitioner looking for a high-level, automated solution for general ML tasks without needing to interact with low-level hardware specifics, or if you are not using Graphcore IPUs.
Stars
13
Forks
—
Language
Python
License
Apache-2.0
Category
ml-frameworks
Last pushed
Mar 12, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/graphcore-research/tessellate-ipu"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
Higher-rated alternatives
pymc-devs/pytensor
PyTensor allows you to define, optimize, and efficiently evaluate mathematical expressions...
arogozhnikov/einops
Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
lava-nc/lava-dl
Deep Learning library for Lava
tensorly/tensorly
TensorLy: Tensor Learning in Python.
tensorpack/tensorpack
A Neural Net Training Interface on TensorFlow, with focus on speed + flexibility