One for Two: A Unified Framework for Imbalanced Graph Classification via Dynamic Balanced Prototype
Abstract
Graph Neural Networks (GNNs) have advanced graph classification, yet they remain vulnerable to graph-level imbalance, which encompasses both class imbalance and topological imbalance. To address both types of imbalance in a unified manner, we propose UniImb, a novel framework for imbalanced graph classification. Specifically, UniImb first captures multi-scale topological features and enhances data diversity via learnable, personalized graph perturbations. It then employs a dynamic balanced prototype module that learns representative prototypes from graph instances, improving the quality of graph representations. Concurrently, a prototype load-balancing optimization term prevents majority samples from dominating training, thereby equalizing the influence of individual samples. We justify these design choices theoretically using the Information Bottleneck principle. Extensive experiments on 19 datasets against 23 baselines demonstrate that UniImb achieves superior performance across various imbalanced scenarios. Our code is available at Anonymous GitHub.