PeterGriffinJin/Patton
Patton: Language Model Pretraining on Text-rich Networks (ACL 2023 main oral)
This is a framework for researchers and practitioners working with complex datasets where entities (like research papers, products, or users) are connected and also have associated text. It takes these 'text-rich networks' as input and produces specialized language models. These models can then be used to perform tasks like classifying entities, finding relevant information, or predicting new connections.
No commits in the last 6 months.
Use this if you need to pretrain a language model to better understand and leverage both the textual content and the relationships within your networked data, such as scientific citation networks or product recommendation graphs.
Not ideal if your data lacks explicit network structures or if your primary goal is to train a general-purpose language model without considering inter-entity relationships.
Stars: 32
Forks: 2
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 10, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/PeterGriffinJin/Patton"
Open to everyone: 100 requests/day with no key required. Get a free key to raise the limit to 1,000/day.
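The same endpoint can be queried from Python. Below is a minimal sketch using only the standard library; it assumes the endpoint returns JSON (the response schema is not documented here, so the raw parsed object is returned as-is), and the `quality_url`/`fetch_quality` helper names are illustrative, not part of the API:

```python
import json
from urllib.request import urlopen

# Base path as shown in the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given GitHub repo."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch quality data for a repo; no key is needed for up to
    100 requests/day. Assumes a JSON response body."""
    with urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Equivalent to the curl command above.
    print(quality_url("PeterGriffinJin", "Patton"))
```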
Higher-rated alternatives
RManLuo/reasoning-on-graphs
Official Implementation of ICLR 2024 paper: "Reasoning on Graphs: Faithful and Interpretable...
alibaba/GraphTranslator
GraphTranslator: Aligning Graph Model to Large Language Model for Open-ended Tasks
HKUDS/OpenGraph
[EMNLP'2024] "OpenGraph: Towards Open Graph Foundation Models"
HKUDS/GraphEdit
"GraphEdit: Large Language Models for Graph Structure Learning"
iMoonLab/LLM4Hypergraph
The source code of ICLR 2025 "Beyond Graphs: Can Large Language Models Comprehend Hypergraphs?"