AntoAndGar/Intrinsic-Dimension

Code implementing the paper "Measuring the Intrinsic Dimension of Objective Landscapes" for research experiments

Score: 20 / 100 (Experimental)

This project helps machine learning researchers understand the complexity of optimization problems in neural networks. It trains neural network models (such as fully connected or LeNet architectures) in random subspaces and outputs a quantitative measure, the 'intrinsic dimension' of the objective landscape. This measure indicates how many parameters truly influence the model's performance, helping researchers identify more efficient ways to train or simplify models. The primary users are researchers in deep learning and optimization.
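The core technique from the paper can be sketched as a random-subspace reparameterization: instead of optimizing all D native parameters, optimize d << D subspace coordinates and map them back through a fixed random projection. This is a minimal illustrative sketch in NumPy, not the repository's actual code; the variable names and the toy dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration (not from the repo): D native parameters,
# d trial subspace dimensions. The paper sweeps d upward until the
# subspace-trained model reaches ~90% of baseline performance.
D = 1000   # native parameter count (e.g. a small fully connected net)
d = 10     # trial subspace dimension

theta_0 = rng.standard_normal(D)      # frozen random initialization
P = rng.standard_normal((D, d))
P /= np.linalg.norm(P, axis=0)        # normalize projection columns

def to_native(theta_d):
    """Map d-dimensional subspace coordinates to the full parameter
    vector: theta = theta_0 + P @ theta_d."""
    return theta_0 + P @ theta_d

# Training starts at theta_d = 0, i.e. exactly at theta_0; an optimizer
# would then update only the d entries of theta_d.
theta_d = np.zeros(d)
theta = to_native(theta_d)
```

In the paper, the smallest d at which subspace training matches a fixed fraction of full-model performance is reported as the intrinsic dimension.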

No commits in the last 6 months.

Use this if you are a machine learning researcher interested in the theoretical aspects of neural network optimization and want to measure the intrinsic dimension of your model's objective landscape.

Not ideal if you are looking for a practical tool to directly improve model training speed or accuracy without delving into the underlying mathematical properties of the optimization landscape.

deep-learning-research neural-network-optimization computational-complexity model-analysis machine-learning-theory
No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 8 / 25
Community 8 / 25


Stars

8

Forks

1

Language

Jupyter Notebook

License

None
Last pushed

Jan 26, 2023

Commits (30d)

0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AntoAndGar/Intrinsic-Dimension"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
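The curl command above can also be reproduced programmatically. A minimal sketch that builds the same endpoint URL from its parts, assuming the path layout `collection/owner/repo` shown in the example (the response schema is not documented here, so this stops short of parsing it):

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(collection: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository,
    percent-encoding each path segment."""
    return f"{BASE}/{quote(collection)}/{quote(owner)}/{quote(repo)}"

url = quality_url("ml-frameworks", "AntoAndGar", "Intrinsic-Dimension")
# Fetch with any HTTP client, e.g. urllib.request.urlopen(url)
```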