Poster in Workshop: Deep Generative Model in Machine Learning: Theory, Principle and Efficacy

Graph Generative Pre-trained Transformer

Xiaohui Chen · Yinkai Wang · Jiaxing He · Yuanqi Du · Soha Hassoun · Xiaolin Xu · Liping Liu

Keywords: [ Graph generation ] [ GPT ] [ Foundation Models ]


Abstract:

Graph generation is an essential task across various domains, such as molecular design and social network analysis, as it enables the modeling of complex relationships and structured data. While many modern graph generative models rely on adjacency matrices, this work revisits an approach that represents graphs as sequences of node and edge sets. We argue that this representation offers more efficient graph encoding and devise a method for serializing graphs as token sequences. Leveraging this representation, we present the Graph Generative Pre-trained Transformer (G2PT), an auto-regressive model designed to learn graph structures through next-token prediction. To extend G2PT's utility as a general-purpose foundation model, we explore fine-tuning techniques for two downstream tasks: goal-oriented generation and graph property prediction. Comprehensive experiments across multiple datasets demonstrate that G2PT delivers state-of-the-art generative performance on both generic graph and molecular datasets. Moreover, G2PT showcases strong adaptability and versatility in downstream applications, ranging from molecular design to property prediction.
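The abstract does not spell out the exact tokenization scheme, but the core idea (serializing a graph as its node set followed by its edge set, so an auto-regressive model can be trained with next-token prediction) can be sketched as follows. This is a minimal, hypothetical illustration: the token vocabulary (`node:`, `src:`, `dst:`, `edge:`, `<bos>`, `<sep>`, `<eos>`) and the node/edge ordering are assumptions for clarity, not G2PT's actual format.

```python
import networkx as nx


def graph_to_token_sequence(G: nx.Graph) -> list[str]:
    """Serialize a graph into a flat token sequence: the node set first,
    then the edge set as (source, destination, edge-label) triples.
    Token names here are illustrative placeholders."""
    tokens = ["<bos>"]
    # Node set: one token per node, ordered by node index.
    for node, data in sorted(G.nodes(data=True), key=lambda x: x[0]):
        tokens.append(f"node:{data.get('label', 'N')}")
    tokens.append("<sep>")
    # Edge set: each edge emitted as source index, target index, edge label.
    for u, v, data in G.edges(data=True):
        tokens.extend([f"src:{u}", f"dst:{v}", f"edge:{data.get('label', 'E')}"])
    tokens.append("<eos>")
    return tokens


# Example: a small molecule-like graph (a triangle of carbon atoms).
G = nx.Graph()
G.add_nodes_from([(0, {"label": "C"}), (1, {"label": "C"}), (2, {"label": "C"})])
G.add_edges_from([
    (0, 1, {"label": "single"}),
    (1, 2, {"label": "single"}),
    (0, 2, {"label": "single"}),
])
print(graph_to_token_sequence(G))
```

A sequence like this could then be fed to a standard decoder-only transformer trained with a next-token prediction objective, which is the setup the abstract describes.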
