alibaba/graph-gpt
Generative Pre-trained Graph Eulerian Transformer [ICML2025]
This project provides graph foundation models for analyzing and generating graph-structured data. It takes graph datasets as input and produces pre-trained models that can predict graph properties or generate new graph structures. It is aimed at researchers and practitioners working with complex network data such as social networks, molecular structures, or citation graphs.
Use this if you need to build highly accurate predictive models or generative models for large-scale graph data, especially if you're exploring diffusion-based approaches.
Not ideal if your graphs are very small or simple, or if you need a solution for non-graph data types.
Stars: 101
Forks: 8
Language: Python
License: MIT
Category:
Last pushed: Dec 23, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/alibaba/graph-gpt"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
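As a sketch, the endpoint above can also be queried programmatically. The response schema is not documented on this page, so treating the body as a JSON object below is an assumption; only the URL itself comes from the curl example.

```python
import json
import urllib.request

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record for a repository.

    NOTE: decoding the response as JSON is an assumption; the
    response format is not specified on this page.
    """
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Same request as the curl example (free tier: 100 requests/day).
    print(quality_url("alibaba", "graph-gpt"))
```

With a free API key (1,000 requests/day), the key would presumably be passed as a header or query parameter; the page does not say which, so that part is left out.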
Higher-rated alternatives
- Nixtla/nixtla: TimeGPT-1, a production-ready pre-trained Time Series Foundation Model for forecasting and...
- andrewdalpino/NoPE-GPT: a GPT-style small language model (SLM) with no positional embeddings (NoPE).
- sigdelsanjog/gptmed: pip install gptmed
- akanyaani/gpt-2-tensorflow2.0: OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
- samkamau81/FinGPT_: FinGPT is an AI language model designed to understand and generate financial content. Built upon...