Kaixhin/nninit
Weight initialisation schemes for Torch7 neural network modules
When building neural networks in Torch7, the starting values of a network's weights and biases strongly affect how well it trains. nninit provides a range of popular and advanced initialisation schemes for these parameters, so you can configure a network for better training behaviour. It is intended for machine learning practitioners and researchers working with neural networks in Torch7.
100 stars. No commits in the last 6 months.
Use this if you need fine-grained control over how the weights and biases of your Torch7 neural network modules are initialised, beyond basic random assignment.
Not ideal if you are not using Torch7 for neural network development, or if you prefer automatic initialisation without per-parameter control.
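To give a feel for the library, here is a minimal sketch of how per-parameter initialisation typically looks with nninit's chainable `:init` method. The exact scheme names and option tables are assumptions based on the project's documentation; check the repository README for the current API.

```lua
-- Sketch only: assumes nninit's chainable :init(parameter, scheme, options)
-- API as described in the repository README.
local nn = require 'nn'
local nninit = require 'nninit'

local model = nn.Sequential()
-- Xavier/Glorot-initialised weights, biases set to a constant zero
model:add(nn.Linear(10, 20)
  :init('weight', nninit.xavier, {dist = 'normal', gain = 'relu'})
  :init('bias', nninit.constant, 0))
model:add(nn.ReLU())
model:add(nn.Linear(20, 1)
  :init('weight', nninit.kaiming, {gain = 'relu'}))
```

Chaining `:init` calls on each module is what gives the fine-grained, per-parameter control mentioned above, as opposed to relying on nn's default reset behaviour.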
Stars
100
Forks
13
Language
Lua
License
MIT
Last pushed
Jun 21, 2017
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Kaixhin/nninit"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Naresh1318/Adversarial_Autoencoder
A wizard's guide to Adversarial Autoencoders
mseitzer/pytorch-fid
Compute FID scores with PyTorch.
acids-ircam/RAVE
Official implementation of the RAVE model: a Realtime Audio Variational autoEncoder
ratschlab/aestetik
AESTETIK: Convolutional autoencoder for learning spot representations from spatial...
jaanli/variational-autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)