Virtual presentation / poster accept

BEEF: Bi-Compatible Class-Incremental Learning via Energy-Based Expansion and Fusion

Fu-Yun Wang · Da-Wei Zhou · Liu Liu · Han-Jia Ye · Yatao Bian · De-Chuan Zhan · Peilin Zhao

Keywords: [ Deep Learning and representational learning ] [ EBMs ] [ Compatibility ] [ continual learning ]


Abstract:

Neural networks suffer from catastrophic forgetting when learning tasks sequentially, phase by phase, making them inapplicable in dynamically updated systems. Class-incremental learning (CIL) aims to enable neural networks to learn new categories across multiple stages. Recently, dynamic-structure-based CIL methods have achieved remarkable performance. However, these methods train all modules in a coupled manner and do not account for possible conflicts among modules, degrading the final predictions. In this work, we propose a unifying energy-based theory and framework called Bi-Compatible Energy-Based Expansion and Fusion (BEEF) to analyze and achieve the goal of CIL. We demonstrate the possibility of training independent modules in a decoupled manner while achieving bi-directional compatibility among modules through two additionally allocated prototypes, and then integrating them into a unifying classifier at minimal cost. Furthermore, BEEF extends the exemplar set to a more challenging setting, where exemplars are randomly selected and imbalanced, and maintains its performance where prior methods fail dramatically. Extensive experiments on three widely used benchmarks, CIFAR-100, ImageNet-100, and ImageNet-1000, demonstrate that BEEF achieves state-of-the-art performance in both the ordinary and challenging CIL settings. Code is available at https://github.com/G-U-N/ICLR23-BEEF.
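To make the expansion-and-fusion idea concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' implementation; see the linked repository for that). It assumes one small module per task, trained independently, whose classifier head carries its own task classes plus two extra prototypes (standing in for the bi-directional compatibility prototypes described above); fusion then keeps only each module's own-class logits and concatenates them into a unified classifier. The names TaskModule and fused_logits are illustrative.

```python
# Hypothetical sketch of decoupled expansion + fusion, NOT the paper's code.
import torch
import torch.nn as nn

class TaskModule(nn.Module):
    """One independently trained module for a single incremental phase."""
    def __init__(self, in_dim: int, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # num_classes task-specific outputs + 2 extra prototypes
        # (placeholders for past and future classes, so the module stays
        # compatible with modules trained before and after it).
        self.head = nn.Linear(feat_dim, num_classes + 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(x))

def fused_logits(modules, class_counts, x: torch.Tensor) -> torch.Tensor:
    """Fuse decoupled modules into one classifier: drop each module's two
    compatibility prototypes and concatenate its own-class logits."""
    parts = [m(x)[:, :c] for m, c in zip(modules, class_counts)]
    return torch.cat(parts, dim=1)

# Usage: two phases of 10 classes each yield a 20-way unified classifier.
mods = [TaskModule(32, 64, 10), TaskModule(32, 64, 10)]
x = torch.randn(4, 32)
print(fused_logits(mods, [10, 10], x).shape)  # torch.Size([4, 20])
```

Because each TaskModule is optimized in isolation, the sketch reflects the decoupled training the abstract describes; the actual BEEF fusion is energy-based rather than a plain concatenation, which this toy version does not capture.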
