BorgwardtLab/ggme

Official repository for the ICLR 2022 paper "Evaluation Metrics for Graph Generative Models: Problems, Pitfalls, and Practical Solutions" https://openreview.net/forum?id=tBtoZYKd9n

Score: 21 / 100 (Experimental)

This tool helps machine learning researchers evaluate the performance of different graph generative models. It takes two sets of graphs – one generated by a model and one real-world dataset – and quantifies how similar they are. This allows researchers to understand which models best capture the characteristics of real-world graphs.
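The comparison idea can be sketched in plain Python. This is an illustrative example, not the repository's actual API: metrics of this kind are often based on Maximum Mean Discrepancy (MMD) between descriptor distributions (e.g. degree histograms) of the real and generated graph sets. The helper names, the bin count, and the Gaussian kernel bandwidth below are all assumptions for illustration.

```python
import math

def degree_histogram(adjacency, bins=5):
    """Normalized degree histogram of a graph given as an adjacency list."""
    hist = [0.0] * bins
    for neighbors in adjacency:
        hist[min(len(neighbors), bins - 1)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two descriptor vectors."""
    dist2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-dist2 / (2 * sigma ** 2))

def mmd(set_x, set_y, kernel=gaussian_kernel):
    """Squared MMD between two sets of descriptor vectors; 0 means identical."""
    def mean_kernel(a_set, b_set):
        return sum(kernel(a, b) for a in a_set for b in b_set) / (len(a_set) * len(b_set))
    return mean_kernel(set_x, set_x) + mean_kernel(set_y, set_y) - 2 * mean_kernel(set_x, set_y)

# Two toy "graph sets": a triangle vs. a 3-node path, as adjacency lists.
triangle = [[1, 2], [0, 2], [0, 1]]
path = [[1], [0, 2], [1]]
real = [degree_histogram(triangle)]
generated = [degree_histogram(path)]
score = mmd(real, generated)  # > 0, since the degree distributions differ
```

A lower score indicates that the generated graphs match the real ones more closely under the chosen descriptor and kernel; the paper's point is that both of those choices materially affect the ranking of models.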

No commits in the last 6 months.

Use this if you are developing or comparing machine learning models that generate graphs and need to rigorously assess their quality.

Not ideal if you are looking for a general-purpose graph analysis tool or don't work with graph generative models.

machine-learning-research graph-neural-networks generative-models model-evaluation computational-science
Status: Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 14
Forks:
Language: Python
License: BSD-3-Clause
Last pushed: Jul 05, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/BorgwardtLab/ggme"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
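The same endpoint can be queried from Python using only the standard library. A minimal sketch, assuming the endpoint returns JSON as the curl example suggests; the response schema is not documented here, so inspect the parsed data before relying on specific fields.

```python
import json
import urllib.request

# URL taken verbatim from the curl example above.
API_URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/BorgwardtLab/ggme"

def fetch_quality(url=API_URL):
    """Fetch the quality-score record and parse it as JSON."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return json.load(response)

# Uncomment to perform the request (subject to the 100 requests/day limit):
# data = fetch_quality()
# print(data)
```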