snap-research/graphless-neural-networks

[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)

Score: 44 / 100 (Emerging)

This project offers a way to classify items in interconnected datasets, like academic papers linked by citations or products linked by co-purchases. Through knowledge distillation, it transfers what a trained graph neural network (GNN) teacher has learned into a plain MLP student that classifies new items almost as accurately but much faster, since the MLP needs no graph structure at inference time. Data scientists and machine learning engineers who work with graph-structured data will find this useful for deploying efficient classification systems.
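The distillation objective behind this approach is a weighted sum of the usual cross-entropy on true labels and a KL-divergence term that pulls the MLP's predictions toward the teacher GNN's soft predictions. A minimal NumPy sketch of that combined loss (function names and the mixing weight `lam` are illustrative, not taken from this repo's code):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Row-wise softmax with optional temperature scaling."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      lam=0.5, temperature=1.0):
    """lam * CE(labels, student) + (1 - lam) * KL(teacher || student).

    student_logits, teacher_logits: (n, num_classes) arrays
    labels: (n,) integer class labels
    """
    p_s = softmax(student_logits, temperature)
    p_t = softmax(teacher_logits, temperature)
    n = student_logits.shape[0]
    # Cross-entropy against the ground-truth labels
    ce = -np.log(p_s[np.arange(n), labels] + 1e-12).mean()
    # KL divergence from the teacher's soft predictions to the student's
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean()
    return lam * ce + (1 - lam) * kl
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the label loss remains; in practice, the student is a plain MLP trained to minimize this loss on node features alone.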

No commits in the last 6 months.

Use this if you need to build a machine learning model that classifies items in a network (like documents or products) and you prioritize faster prediction speeds without sacrificing much accuracy.

Not ideal if you primarily need to develop new graph neural network architectures from scratch, or if maximum classification accuracy, regardless of computational cost, is your top priority.

network-analysis node-classification graph-data machine-learning-operations model-optimization
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 16 / 25
Community 19 / 25


Stars: 95
Forks: 21
Language: Python
License: MIT
Last pushed: Oct 22, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/snap-research/graphless-neural-networks"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
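The same endpoint can be called from Python. A minimal sketch using only the standard library; the helper name is illustrative, and the JSON response schema is an assumption (the page does not document it), so the network call is left commented out:

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "snap-research", "graphless-neural-networks")

# Uncomment to fetch live data (subject to the 100 requests/day limit):
# with urlopen(url) as resp:
#     data = json.load(resp)  # response assumed to be JSON
```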