TimRoith/BregmanLearning

Optimizing neural networks via an inverse scale space flow.

Overall score: 38 / 100 (Emerging)

This project helps machine learning researchers and practitioners train more efficient neural networks. It takes your existing network architecture and training data and produces a sparse network that uses fewer parameters while maintaining performance. This is ideal if you need to deploy models on limited computational resources or want to speed up training.
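To make the idea concrete, below is a minimal sketch of the kind of update the project's name refers to: a linearized Bregman (inverse scale space) iteration that keeps parameters sparse by updating a dual variable with plain gradient steps and recovering the weights through soft-thresholding. This is an illustration on a toy sparse regression problem, not the repository's own optimizer API; the hyperparameters and variable names are assumptions chosen for the example.

```python
import torch


def soft_threshold(v, lam):
    """Soft-shrinkage, the proximal map of lam * ||.||_1."""
    return torch.sign(v) * torch.clamp(v.abs() - lam, min=0.0)


# Toy sparse regression problem standing in for a network layer's weights.
torch.manual_seed(0)
A = torch.randn(200, 50)
x_true = torch.zeros(50)
x_true[:5] = torch.randn(5)          # only 5 of 50 coefficients are nonzero
b = A @ x_true

lam, delta, tau = 0.1, 1.0, 0.05     # sparsity weight, elastic-net scale, step size (illustrative)
v = torch.zeros(50)                  # dual / subgradient variable driven by the flow
x = torch.zeros(50)                  # primal parameters, initialized fully sparse

for _ in range(2000):
    grad = A.T @ (A @ x - b) / len(b)     # gradient of the data-fit loss at the current iterate
    v = v - tau * grad                    # Bregman (inverse scale space) step on the dual variable
    x = delta * soft_threshold(v, lam)    # primal weights stay sparse via soft-thresholding

print(f"nonzero weights: {(x != 0).sum().item()} / {x.numel()}")
```

In a neural network setting, the same two-step update is applied per parameter group, so weights enter the model gradually as the flow evolves rather than being pruned after dense training.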

No commits in the last 6 months.

Use this if you need to train sparse neural networks with fewer parameters, making them more efficient for deployment or faster training.

Not ideal if you are looking for a pre-trained model or a tool for general machine learning tasks beyond sparse network optimization.

neural-network-optimization model-compression deep-learning-efficiency sparse-models machine-learning-research
Stale (6m) · No package · No dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 18
Forks: 6
Language: Jupyter Notebook
License: MIT
Last pushed: Jan 30, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TimRoith/BregmanLearning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
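If you prefer calling the endpoint from Python, here is a small standard-library equivalent of the curl command above. It assumes the endpoint returns a JSON body but makes no assumptions about the field names in the response.

```python
import json
import urllib.request

# Same endpoint as the curl example above (anonymous access: 100 requests/day).
URL = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/TimRoith/BregmanLearning")

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)           # assumes the API responds with JSON

print(json.dumps(data, indent=2))    # print whatever fields the service returns
```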