

Virtual presentation / poster accept

DynaMS: Dynamic Margin Selection for Efficient Deep Learning

Jiaxing Wang · Yong Li · Jingwei Zhuo · Xupeng Shi · Weizhong Zhang · Lixing Gong · Tong Tao · Pengzhang Liu · Yongjun Bao · Weipeng Yan

Keywords: [ Deep Learning and Representational Learning ] [ Efficient Training ] [ Data Selection ]


Abstract:

The great success of deep learning is largely driven by training over-parameterized models on massive datasets. To avoid excessive computation, extracting and training on only the most informative subset is drawing increasing attention. Nevertheless, it remains an open question how to select a subset such that a model trained on it generalizes on par with one trained on the full data. In this paper, we propose dynamic margin selection (DynaMS). DynaMS leverages the distance of candidate samples to the classification boundary to construct the subset, and the subset is dynamically updated during model training. We show that DynaMS converges with high probability, and show, for the first time both in theory and in practice, that dynamically updating the subset yields better generalization than previous approaches. To reduce the additional computation incurred by selection, we design a lightweight parameter-sharing proxy (PSP). PSP faithfully evaluates instances with respect to the current model, which is necessary for dynamic selection. Extensive analysis and experiments demonstrate the superiority of the proposed approach against many state-of-the-art data-selection counterparts on benchmark datasets.
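The abstract describes two components: margin-based subset selection and periodic re-selection during training. As an illustration only, below is a minimal PyTorch sketch of the selection step, assuming a softmax classifier whose distance to the boundary is approximated by the gap between the true-class logit and the strongest competing logit. The function names (`margin_scores`, `select_subset`), the logit-gap margin, and the smallest-margin selection rule are assumptions for illustration, not the paper's exact procedure.

```python
import torch
from torch.utils.data import DataLoader, Subset

def margin_scores(model, dataset, batch_size=256, device="cpu"):
    """Score each sample by a distance-to-boundary proxy: the gap between
    the true-class logit and the strongest competing logit. Small gaps
    indicate samples near the current decision boundary.
    (Illustrative margin; the paper's exact definition may differ.)"""
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=False)
    model.eval()
    scores = []
    with torch.no_grad():
        for x, y in loader:
            logits = model(x.to(device))
            y = y.to(device).view(-1, 1)
            true_logit = logits.gather(1, y).squeeze(1)
            masked = logits.scatter(1, y, float("-inf"))  # hide true class
            runner_up = masked.max(dim=1).values
            scores.append((true_logit - runner_up).cpu())
    return torch.cat(scores)

def select_subset(dataset, model, budget):
    """Keep the `budget` samples closest to the boundary (smallest margins)."""
    scores = margin_scores(model, dataset)
    keep = torch.argsort(scores)[:budget]
    return Subset(dataset, keep.tolist())
```

In a dynamic variant, `select_subset` would be re-run every few epochs, ideally scoring with the lightweight proxy rather than the full model, so the subset tracks the moving decision boundary at low cost.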
