shyamsn97/hyper-nn
Easy Hypernetworks in PyTorch and JAX
This tool helps machine learning engineers design and implement hypernetworks: neural networks that generate the parameters of other neural networks. Given an existing network architecture, it produces a hypernetwork that dynamically generates the weights of that network. It suits work on models where dynamic weight generation enables more efficient or adaptive behavior.
106 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to build advanced neural network architectures where one network generates the weights for another, particularly for tasks requiring adaptive or parameter-efficient models.
Not ideal if you are new to deep learning or prefer traditional, fixed-weight neural network designs for simpler applications.
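To make the idea above concrete, here is a minimal NumPy sketch of a hypernetwork: a small "weight generator" maps a conditioning embedding to the flattened weights of a target linear layer. This is an illustration of the concept only, not the hyper-nn API; all names (`z`, `H`, `W`) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

in_features, out_features, embed_dim = 4, 2, 8

# Conditioning embedding: in a real setup this would be learned or
# derived from context (e.g. the task or input).
z = rng.normal(size=embed_dim)

# The "hypernetwork": a linear map from the embedding to the
# flattened weights of the target layer.
H = rng.normal(size=(in_features * out_features, embed_dim))

# Generate the target layer's weight matrix from the embedding.
W = (H @ z).reshape(out_features, in_features)

# Forward pass of the target network using the generated weights.
x = rng.normal(size=(3, in_features))
y = x @ W.T
print(y.shape)  # (3, 2)
```

In practice the hypernetwork is trained end to end: gradients flow through the generated weights `W` back into `H` and `z`, so changing the embedding changes the behavior of the target network without storing separate weight sets.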
Stars
106
Forks
6
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Jan 27, 2023
Commits (30d)
0
Dependencies
4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/shyamsn97/hyper-nn"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
explosion/thinc
🔮 A refreshing functional take on deep learning, compatible with your favorite libraries
google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
patrick-kidger/diffrax
Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable.
google/grain
Library for reading and processing ML training data.
patrick-kidger/equinox
Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/