snap-research/MLPInit-for-GNNs

[ICLR 2023] MLPInit: Embarrassingly Simple GNN Training Acceleration with MLP Initialization

Score: 31 / 100 (Emerging)

Training graph neural networks (GNNs) on large datasets is slow and resource-intensive, and speeding it up often comes at the cost of prediction accuracy. This project accelerates GNN training with a simple pre-training step: a Multi-Layer Perceptron (MLP) is first trained on the node features alone, and its weights are then used to initialize the GNN. The result is much faster GNN training (up to a 33x speedup) and often improved prediction accuracy, making it a good fit for machine learning researchers and practitioners working with large-scale graph data.
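
The trick works because a GNN's trainable weight matrices have the same shapes as those of an MLP with matching hidden dimensions, so the cheaply trained MLP can supply the GNN's starting weights directly. Below is a minimal PyTorch sketch of that idea with toy dimensions and a dense placeholder adjacency; all names are illustrative, not the repo's actual code.

import torch
import torch.nn as nn
import torch.nn.functional as F

# PeerMLP: an MLP whose trainable weights mirror the GNN's layer shapes.
class PeerMLP(nn.Module):
    def __init__(self, d_in, d_hid, d_out):
        super().__init__()
        self.lin1 = nn.Linear(d_in, d_hid)
        self.lin2 = nn.Linear(d_hid, d_out)

    def forward(self, x):
        return self.lin2(F.relu(self.lin1(x)))

# A GCN-style GNN that reuses the exact same parameters; only the forward
# pass differs, adding neighbor aggregation via a (normalized) adjacency.
class SimpleGNN(PeerMLP):
    def forward(self, x, adj):
        x = F.relu(self.lin1(adj @ x))
        return self.lin2(adj @ x)

# Toy data: 100 nodes, 32 features, 7 classes; identity adjacency as a stand-in.
x = torch.randn(100, 32)
y = torch.randint(0, 7, (100,))
adj = torch.eye(100)

# Step 1: cheap MLP pre-training (no graph operations, so each epoch is fast).
mlp = PeerMLP(32, 64, 7)
opt = torch.optim.Adam(mlp.parameters(), lr=1e-2)
for _ in range(50):
    opt.zero_grad()
    F.cross_entropy(mlp(x), y).backward()
    opt.step()

# Step 2: copy the trained MLP weights into the GNN, then fine-tune as usual.
gnn = SimpleGNN(32, 64, 7)
gnn.load_state_dict(mlp.state_dict())
logits = gnn(x, adj)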

No commits in the last 6 months.

Use this if you are a machine learning researcher or practitioner struggling with the long training times and complexity of applying GNNs to large graph datasets for tasks like node classification or link prediction.

Not ideal if your primary focus is on small graph datasets where GNN training speed is not a significant bottleneck, or if you are not working with GNNs at all.

graph-neural-networks machine-learning-acceleration large-scale-graph-analysis node-classification link-prediction
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 6 / 25


Stars: 76
Forks: 3
Language: Jupyter Notebook
License: MIT
Last pushed: Apr 05, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/snap-research/MLPInit-for-GNNs"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
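
For scripted access, here is a minimal Python sketch of the same call using the requests library; the response schema is not documented on this page, so it simply prints the returned JSON.

import requests

# Fetch the quality data shown on this page; no API key needed at the free tier.
url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "ml-frameworks/snap-research/MLPInit-for-GNNs")
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())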