ZhuYun97/GraphCLIP

Official implementation of GraphCLIP: Enhancing Transferability in Graph Foundation Models for Text-Attributed Graphs

Quality score: 26 / 100 (Experimental)

This project helps data scientists and researchers improve the accuracy of machine learning models when working with large, interconnected datasets that include text, such as research paper networks or social media graphs. It takes existing text-attributed graphs as input and provides a pre-trained model checkpoint that can be directly applied to new, similar graphs to classify or analyze nodes with higher performance, even without extensive retraining. This is ideal for those who need to quickly adapt powerful graph models across different but related datasets.

No commits in the last 6 months.

Use this if you need to apply advanced graph machine learning to new datasets with text attributes, but don't want to train a complex model from scratch every time.

Not ideal if your datasets do not contain text information associated with their nodes, or if you need to build a model for simple, non-graph structured data.

Tags: academic-research, social-network-analysis, document-classification, information-retrieval, data-science-modeling

No license · stale for 6 months · not published as a package · no dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 8 / 25
Community: 10 / 25


Stars: 67
Forks: 6
Language: Python
License: none
Last pushed: Feb 26, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ZhuYun97/GraphCLIP"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
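For scripted access, the curl call above can be reproduced with Python's standard library. This is a minimal sketch: the endpoint URL is taken verbatim from the curl example, but the shape of the JSON response (field names such as scores or stats) is not documented here and should be inspected before relying on it.

```python
import json
import urllib.request

# Endpoint copied from the curl example above; no API key is needed
# for up to 100 requests/day.
BASE = "https://pt-edge.onrender.com/api/v1/quality"
repo = "transformers/ZhuYun97/GraphCLIP"
url = f"{BASE}/{repo}"


def fetch_quality(endpoint: str) -> dict:
    """Fetch and decode the JSON quality report for a repository.

    The response schema is an assumption and should be verified
    against an actual response.
    """
    with urllib.request.urlopen(endpoint, timeout=10) as resp:
        return json.load(resp)
```

Calling `fetch_quality(url)` should return the same data the curl command prints, as a Python dictionary.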