snap-research/MLPInit-for-GNNs
[ICLR 2023] MLPInit: Embarrassingly Simple GNN Training Acceleration with MLP Initialization
Training graph neural networks (GNNs) on large datasets is slow and resource-intensive. This project accelerates GNN training with an embarrassingly simple pre-training step: first train a cheap Multi-Layer Perceptron (MLP) on the node features, then use its weights to initialize the GNN. The outcome is faster GNN training (up to a 33x speedup) and often improved prediction accuracy, making it useful for machine learning researchers and practitioners working with large-scale graph data.
No commits in the last 6 months.
Use this if you are a machine learning researcher or practitioner struggling with the long training times and complexity of applying GNNs to large graph datasets for tasks like node classification or link prediction.
Not ideal if your primary focus is on small graph datasets where GNN training speed is not a significant bottleneck, or if you are not working with GNNs at all.
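The core trick is that an MLP and a GNN can share identical weight-matrix shapes, since a GNN layer is just a linear transform wrapped in neighbor aggregation. A minimal NumPy sketch of the idea is below; the GCN-style propagation, variable names, and random weights are illustrative assumptions, not the repo's actual code (which uses PyTorch and sampling-based GNNs such as GraphSAGE).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 5 nodes, 4-dim features, row-normalized adjacency with self-loops.
X = rng.normal(size=(5, 4))
A = rng.integers(0, 2, size=(5, 5)).astype(float)
np.fill_diagonal(A, 1.0)
A /= A.sum(axis=1, keepdims=True)

# The MLP and the GNN use the *same* weight matrices; the GNN layer only
# adds neighbor aggregation (A @ ...) around the same linear transforms.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))

def mlp_forward(X, W1, W2):
    # "PeerMLP": ignores the graph, trains fast on node features alone.
    return np.maximum(X @ W1, 0) @ W2

def gnn_forward(A, X, W1, W2):
    # GCN-style layer: aggregate neighbors, then apply the shared weights.
    H = np.maximum(A @ X @ W1, 0)
    return A @ H @ W2

# MLPInit: train the cheap MLP first (training loop omitted here), then
# reuse its weights verbatim as the GNN's initialization.
W1_init, W2_init = W1, W2  # stand-ins for MLP-trained weights
out = gnn_forward(A, X, W1_init, W2_init)
print(out.shape)  # (5, 3)
```

Because no reshaping or mapping is needed, the hand-off is a plain weight copy, which is what makes the method "embarrassingly simple".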
Stars
76
Forks
3
Language
Jupyter Notebook
License
MIT
Category
ML Frameworks
Last pushed
Apr 05, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/snap-research/MLPInit-for-GNNs"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
pyg-team/pytorch_geometric
Graph Neural Network Library for PyTorch
a-r-j/graphein
Protein Graph Library
raamana/graynet
Subject-wise networks from structural MRI, both vertex- and voxel-wise features (thickness, GM...
pykale/pykale
Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for...
dmlc/dgl
Python package built to ease deep learning on graph, on top of existing DL frameworks.