graphcore-research/jax-experimental
JAX for Graphcore IPU (experimental)
This project lets machine learning researchers and AI developers run their experimental JAX models on Graphcore IPU hardware. You write ordinary JAX code; the library compiles and executes it on the IPU and returns the computed outputs, which makes it a fit for exploring new model architectures or algorithms (see the sketch below).
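A minimal sketch of the intended workflow: standard jitted JAX code that an IPU build of JAX would compile for the device. The device-selection call shown in the trailing comment is an assumption (the exact backend name and setup steps are defined by the project itself), and the snippet runs unchanged on any JAX backend.

import jax
import jax.numpy as jnp

# Ordinary JAX code: jit-compiled, so the active backend (CPU, GPU, or IPU
# with this fork installed) receives a single compiled program.
@jax.jit
def predict(w, x):
    return jnp.tanh(x @ w)

w = jnp.ones((16, 4))
x = jnp.ones((8, 16))
print(predict(w, x).shape)  # (8, 4)

# Assumption: the fork exposes IPUs through the standard device API, e.g.
#   ipu = jax.devices("ipu")[0]
#   y = jax.device_put(x, ipu)
# Check the project README for the exact backend name.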
No commits in the last 6 months.
Use this if you are a machine learning researcher or AI developer experimenting with JAX and want to evaluate its performance on Graphcore IPU hardware.
Not ideal if you need a production-ready solution for deploying or training models, as this is an experimental research project.
Stars: 22
Forks: 2
Language: Python
License: Apache-2.0
Category: ml-frameworks
Last pushed: Mar 12, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/graphcore-research/jax-experimental"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
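For scripted access, the same endpoint can be queried from Python. A minimal sketch, assuming the endpoint returns a JSON body (the response schema is not documented here):

import requests

# Public endpoint from this listing; 100 requests/day without an API key.
URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/graphcore-research/jax-experimental"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
print(resp.json())  # Assumption: JSON response; field names not documented here.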
Higher-rated alternatives
explosion/thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
patrick-kidger/diffrax
Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable....
google/grain
Library for reading and processing ML training data.
patrick-kidger/equinox
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/