Poster in Workshop: SCOPE: SCALABLE OPTIMIZATION FOR EFFICIENT AND ADAPTIVE FOUNDATION MODELS
Graph Low-Rank Adapters of High Regularity for Graph Neural Networks and Graph Transformers
Pantelis Papageorgiou · Haitz Sáez de Ocáriz Borde · Anastasis Kratsios · Michael Bronstein
Keywords: [ Low-Rank Adapters ] [ Graph Neural Networks ] [ Fine-tuning ]
Abstract:
We introduce a new low-rank graph adapter, GConv-Adapter, that combines a two-fold normalized graph convolution with trainable low-rank weight matrices to achieve state-of-the-art (SOTA) and near-SOTA performance when fine-tuning standard message-passing neural networks (MPNNs) and graph transformers (GTs), in both inductive and transductive learning. We motivate our design by deriving an upper bound on the adapter's Lipschitz constant for $\delta$-regular random (expander) graphs, and we compare it against previous methods, whose Lipschitz constants we show to be unbounded.
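The abstract's description admits a compact sketch. The snippet below is a minimal, hypothetical PyTorch reading of GConv-Adapter, not the authors' exact architecture: the "two-fold normalized graph convolution" is interpreted here as two applications of the symmetrically normalized operator $D^{-1/2}(A+I)D^{-1/2}$ wrapped around a trainable rank-$r$ bottleneck, added residually to a frozen backbone layer's output. The class name, rank, scaling factor, zero initialization, and residual placement are all illustrative assumptions.

```python
import torch
import torch.nn as nn


def sym_norm_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} with self-loops."""
    adj = adj + torch.eye(adj.size(0), device=adj.device)
    deg = adj.sum(dim=1)
    d_inv_sqrt = deg.pow(-0.5)
    d_inv_sqrt[torch.isinf(d_inv_sqrt)] = 0.0  # guard isolated nodes
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


class GConvAdapter(nn.Module):
    """Hypothetical sketch of a low-rank graph adapter: a normalized graph
    convolution applied twice around a rank-r bottleneck (down- and
    up-projection), added residually to a frozen layer's features."""

    def __init__(self, dim: int, rank: int = 8, scale: float = 1.0):
        super().__init__()
        self.down = nn.Linear(dim, rank, bias=False)  # low-rank factor A
        self.up = nn.Linear(rank, dim, bias=False)    # low-rank factor B
        nn.init.zeros_(self.up.weight)  # adapter starts as a zero delta
        self.scale = scale

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        a_hat = sym_norm_adj(adj)
        # two-fold application of the normalized operator around the bottleneck
        h = a_hat @ self.down(x)
        h = a_hat @ self.up(h)
        return x + self.scale * h
```

Zero-initializing the up-projection (a common LoRA convention, assumed here) makes the adapter an identity map at the start of fine-tuning, so training begins exactly from the pretrained backbone's behavior.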