daiquocnguyen/Graph-Transformer
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
This project is aimed at data scientists and machine learning engineers analyzing relationships in graph-structured data such as social networks or biological pathways. Given raw graph data (nodes and edges), the model learns representations of those entities that support tasks such as classifying molecules or predicting user preferences. It is intended for practitioners working with large, complex networks who need learned graph representations rather than hand-crafted features.
680 stars. No commits in the last 6 months.
Use this if you are a data scientist or researcher working with graph-structured data and need to extract advanced insights for classification or prediction tasks.
Not ideal if you primarily work with tabular data or simple datasets without inherent graph structures.
Stars: 680
Forks: 74
Language: Python
License: Apache-2.0
Category: NLP
Last pushed: Aug 16, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/daiquocnguyen/Graph-Transformer"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
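If you prefer to consume the endpoint from code rather than curl, here is a minimal Python sketch. The JSON field names below are assumptions inferred from the stats shown on this page, not a documented schema; check the live response before relying on them.

```python
import json

# Assumed response shape for the pt-edge quality API; the real schema may differ.
sample_response = json.dumps({
    "repo": "daiquocnguyen/Graph-Transformer",
    "stars": 680,
    "forks": 74,
    "language": "Python",
    "license": "Apache-2.0",
    "last_pushed": "2022-08-16",
    "commits_30d": 0,
})

def summarize(raw: str) -> str:
    """Turn a raw API response into a one-line activity summary."""
    data = json.loads(raw)
    # A repo with zero commits in the last 30 days is flagged as inactive.
    active = "active" if data["commits_30d"] > 0 else "inactive"
    return (
        f"{data['repo']}: {data['stars']} stars, "
        f"{active} (last push {data['last_pushed']})"
    )

print(summarize(sample_response))
# To fetch live data instead (requires network), something like:
# import urllib.request
# raw = urllib.request.urlopen(
#     "https://pt-edge.onrender.com/api/v1/quality/nlp/daiquocnguyen/Graph-Transformer"
# ).read()
```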
Higher-rated alternatives
kyzhouhzau/NLPGNN
1. Use BERT, ALBERT and GPT2 as tensorflow2.0's layer. 2. Implement GCN, GAN, GIN and...
IndexFziQ/GNN4NLP-Papers
A list of recent papers about Graph Neural Network methods applied in NLP areas.
qipeng/gcn-over-pruned-trees
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction (authors' PyTorch...
kenqgu/Text-GCN
A PyTorch implementation of "Graph Convolutional Networks for Text Classification." (AAAI 2019)
Cynwell/Text-Level-GNN
Text Level Graph Neural Network for Text Classification