shyamsn97/hyper-nn

Easy Hypernetworks in PyTorch and JAX

Score: 43 / 100 (Emerging)

hyper-nn helps machine learning engineers design and implement hypernetworks: neural networks that generate the parameters of other neural networks. You provide an existing network architecture, and the library wraps it in a hypernetwork that dynamically generates that network's weights. It's aimed at practitioners building complex AI models where dynamic weight generation can lead to more efficient or adaptive solutions.
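To make the idea concrete, here is a minimal sketch of the hypernetwork pattern in plain PyTorch. It does not use hyper-nn's actual API (the class and parameter names below are illustrative): a small generator network maps a learned embedding to the flattened weights and bias of a target linear layer, which is then applied functionally.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyHyperLinear(nn.Module):
    """Illustrative hypernetwork: generates the weights of a Linear layer.

    A learned embedding is passed through a generator network whose output
    is reshaped into the weight matrix and bias of the target layer.
    """

    def __init__(self, embedding_dim: int, in_dim: int, out_dim: int):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        n_params = in_dim * out_dim + out_dim  # target weight + bias
        self.embedding = nn.Parameter(torch.randn(embedding_dim))
        self.generator = nn.Linear(embedding_dim, n_params)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        params = self.generator(self.embedding)
        split = self.in_dim * self.out_dim
        weight = params[:split].view(self.out_dim, self.in_dim)
        bias = params[split:]
        # Apply the *generated* parameters to the input functionally.
        return F.linear(x, weight, bias)

net = TinyHyperLinear(embedding_dim=8, in_dim=4, out_dim=2)
y = net(torch.randn(5, 4))
print(y.shape)  # batch of 5 inputs mapped to 2 outputs
```

Training proceeds as usual: gradients flow through the generated weights back into the generator and the embedding, so only the hypernetwork's parameters are optimized.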

106 stars. No commits in the last 6 months. Available on PyPI.

Use this if you need to build advanced neural network architectures where one network generates the weights for another, particularly for tasks requiring adaptive or parameter-efficient models.

Not ideal if you are new to deep learning or prefer traditional, fixed-weight neural network designs for simpler applications.

deep-learning neural-network-design model-optimization adaptive-ai machine-learning-engineering
Maintenance: 0 / 25 (stale for 6 months)
Adoption: 9 / 25
Maturity: 25 / 25
Community: 9 / 25


Stars: 106
Forks: 6
Language: Jupyter Notebook
License: MIT
Last pushed: Jan 27, 2023
Commits (30d): 0
Dependencies: 4

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/shyamsn97/hyper-nn"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.