wpeebles/G.pt

Official PyTorch Implementation of "Learning to Learn with Generative Models of Neural Network Checkpoints"

Score: 39/100 (Emerging)

This project helps machine learning researchers efficiently explore and optimize neural networks. Given an existing network's parameters and a target metric (such as a desired loss or error rate), it outputs an updated set of parameters predicted to achieve that target, often in a single step. It is aimed at researchers working on model development and optimization, offering a novel way to 'learn to learn' across various neural network types.
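As a sketch of that input/output contract (purely illustrative: the names below are hypothetical and are not the repository's actual API):

```python
# Illustrative contract only: `generator` stands in for a trained checkpoint
# generator in the spirit of G.pt; the real repository's API differs.
def update_parameters(generator, theta, target_metric):
    """One-step 'learned optimization': map (current parameters,
    desired loss/error) to a new parameter vector."""
    return generator(theta, target_metric)

# Toy stand-in generator that just shrinks weights, to show the call shape.
toy_generator = lambda theta, target: [w * 0.5 for w in theta]

new_theta = update_parameters(toy_generator, [1.0, -2.0], 0.1)
print(new_theta)  # prints [0.5, -1.0]
```

The point is the signature, not the toy arithmetic: one call replaces an inner training loop.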

345 stars. No commits in the last 6 months.

Use this if you are a machine learning researcher looking for a new method to quickly generate optimized neural network parameters or explore the parameter space of different models.

Not ideal if you are an application developer seeking a ready-to-use library for standard model training or fine-tuning, as this is a research-focused tool for meta-learning.

meta-learning neural-network-optimization model-architecture-search reinforcement-learning-research computer-vision-research
Stale (6 months) · No package · No dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 13 / 25


Stars: 345
Forks: 24
Language: Python
License: BSD-2-Clause
Last pushed: Oct 03, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/wpeebles/G.pt"

Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
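For scripted use, the same endpoint can be read with Python's standard library. A minimal sketch, assuming the endpoint returns JSON; the `fetch_quality` helper and the sample field names below are illustrative assumptions, not documented response fields:

```python
import json
import urllib.request

API_URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/wpeebles/G.pt"

def fetch_quality(url=API_URL):
    """Fetch the quality report as JSON (no key needed up to 100 requests/day)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Hypothetical response shape for illustration; the real field names may differ.
sample = {"maintenance": 0, "adoption": 10, "maturity": 16, "community": 13}
overall = sum(sample.values())
print(overall)  # the four 25-point sub-scores sum to the 39/100 overall score
```

A key, once obtained, would typically be sent as a request header; check the service's documentation for the exact header name before relying on it.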