TimRoith/BregmanLearning
Optimizing neural networks via an inverse scale space flow.
This project helps machine learning researchers and practitioners train more efficient neural networks. It takes an existing network architecture and training data and produces a sparse network that uses fewer parameters while maintaining performance, which is useful when models must be deployed under tight computational budgets or trained faster.
No commits in the last 6 months.
Use this if you need to train sparse neural networks with fewer parameters, making them more efficient for deployment or faster training.
Not ideal if you are looking for a pre-trained model or a tool for general machine learning tasks beyond sparse network optimization.
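For context, the "inverse scale space" training referred to above belongs to the Bregman-iteration family of methods, where, roughly, parameters start sparse and are activated only once accumulated gradient information exceeds a threshold. Below is a minimal NumPy sketch of one such update, a linearized Bregman step with an l1 prox applied to a toy quadratic loss; the function names and the step sizes tau, delta, and lam are illustrative assumptions, not this repository's actual optimizer API.

    # Minimal sketch of a linearized Bregman (inverse scale space) step with an
    # l1 prox (soft-thresholding). Illustrative only; tau, delta, lam and the
    # plain-NumPy setup are assumptions, not this repo's API.
    import numpy as np

    def soft_threshold(v, lam):
        # prox of lam * ||.||_1: shrinks small entries to exactly zero
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def linbreg_step(v, grad, tau=0.1, delta=1.0, lam=0.01):
        # v accumulates (sub)gradient information; the parameters theta stay sparse
        v = v - tau * grad                       # gradient step on the subgradient variable
        theta = delta * soft_threshold(v, lam)   # a parameter turns on only once |v| exceeds lam
        return v, theta

    # toy usage: quadratic loss 0.5*||theta - target||^2 with a sparse target
    target = np.array([0.0, 2.0, 0.0, -1.5])
    v = np.zeros_like(target)     # sparse initialization: all parameters start at zero
    theta = np.zeros_like(target)
    for _ in range(200):
        grad = theta - target     # gradient of the toy loss at the current parameters
        v, theta = linbreg_step(v, grad)
    print(theta)                  # non-zero only where the target is non-zero

The soft threshold is what produces exact zeros, which is how this style of training ends up with a network that uses fewer active parameters.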
Stars: 18
Forks: 6
Language: Jupyter Notebook
License: MIT
Category: ml-frameworks
Last pushed: Jan 30, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TimRoith/BregmanLearning"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
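If you prefer Python over curl, a minimal sketch of the same request is shown below. The endpoint is taken from the curl command above, but the shape of the JSON payload is an assumption, so the example just prints the raw response rather than relying on particular keys.

    # Minimal sketch of fetching the same data in Python instead of curl.
    # The JSON schema is not documented here, so the sketch only prints the raw payload.
    import requests

    url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/TimRoith/BregmanLearning"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()   # fail loudly on rate limiting or other HTTP errors
    data = resp.json()
    print(data)               # inspect the payload before relying on specific keys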
Higher-rated alternatives
pykt-team/pykt-toolkit
pyKT: A Python Library to Benchmark Deep Learning based Knowledge Tracing Models
microsoft/archai
Accelerate your Neural Architecture Search (NAS) through fast, reproducible and modular research.
google-research/morph-net
Fast & Simple Resource-Constrained Learning of Deep Network Structure
AI-team-UoA/pyJedAI
An open-source library that leverages Python’s data science ecosystem to build powerful...
IDEALLab/EngiBench
Benchmarks for automated engineering design