AntoAndGar/Intrinsic-Dimension
Code implementing the paper "Measuring the Intrinsic Dimension of Objective Landscapes" for research purposes
This project helps machine learning researchers understand the complexity of optimization problems in neural networks. It takes neural network models (such as fully connected or LeNet architectures) and outputs a quantitative measure, the 'intrinsic dimension' of their objective landscapes. This measure indicates how many independent directions in parameter space are actually needed to reach a good solution, helping researchers identify more efficient ways to train or simplify models. The primary users are researchers in deep learning and optimization.
No commits in the last 6 months.
Use this if you are a machine learning researcher interested in the theoretical aspects of neural network optimization and want to measure the intrinsic dimension of your model's objective landscape.
Not ideal if you are looking for a practical tool to directly improve model training speed or accuracy without delving into the underlying mathematical properties of the optimization landscape.
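The intrinsic-dimension idea behind this repo (from Li et al., 2018) is to freeze the full parameter vector and train only a small number of coordinates inside a random linear subspace: the full parameters are reconstructed as theta = theta_0 + P @ theta_d, where P is a fixed random projection. The smallest subspace dimension d at which training still reaches good performance is the intrinsic dimension. Here is a minimal, self-contained NumPy sketch of that subspace-training mechanic on a toy quadratic loss; the dimensions, learning rate, and loss are illustrative choices, not values from the repo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "objective landscape": a positive-definite quadratic loss over D parameters.
D, d = 100, 10                          # full dimension, random-subspace dimension (arbitrary)
A = rng.normal(size=(D, D)) / np.sqrt(D)
H = A.T @ A + 0.01 * np.eye(D)          # positive-definite Hessian
theta_star = rng.normal(size=D)         # the (unknown) minimizer

def loss(theta):
    diff = theta - theta_star
    return 0.5 * diff @ H @ diff

# Random subspace training: freeze theta_0, optimize only theta_d,
# with theta = theta_0 + P @ theta_d for a fixed random projection P.
theta_0 = np.zeros(D)
P = rng.normal(size=(D, d))
P /= np.linalg.norm(P, axis=0)          # unit-norm columns, as in the paper

theta_d = np.zeros(d)
lr = 0.1
for _ in range(2000):
    theta = theta_0 + P @ theta_d
    grad_full = H @ (theta - theta_star)        # gradient w.r.t. full parameters
    theta_d -= lr * (P.T @ grad_full)           # chain rule: project gradient into subspace

final = loss(theta_0 + P @ theta_d)
print(final < loss(theta_0))            # subspace training reduced the loss
```

Sweeping d upward and recording the smallest value at which the subspace solution matches full-dimensional training is how the intrinsic dimension is read off in practice.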
Stars
8
Forks
1
Language
Jupyter Notebook
License
—
Category
ml-frameworks
Last pushed
Jan 26, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AntoAndGar/Intrinsic-Dimension"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
MaximeVandegar/Papers-in-100-Lines-of-Code
Implementation of papers in 100 lines of code.
kk7nc/RMDL
RMDL: Random Multimodel Deep Learning for Classification
OML-Team/open-metric-learning
Metric learning and retrieval pipelines, models and zoo.
miguelvr/dropblock
Implementation of DropBlock: A regularization method for convolutional networks in PyTorch.
PaddlePaddle/models
Officially maintained, supported by PaddlePaddle, including CV, NLP, Speech, Rec, TS, big models...