

Spotlight Poster

Dictionary Contrastive Learning for Efficient Local Supervision without Auxiliary Networks

Suhwan Choi · Myeongho Jeon · Yeonjung Hwang · Jeonglyul Oh · Sungjun Lim · Joonseok Lee · Myungjoo Kang

Halle B #58
Thu 9 May 1:45 a.m. PDT — 3:45 a.m. PDT

Abstract:

While backpropagation (BP) has achieved widespread success in deep learning, it faces two prominent challenges: computational inefficiency and biological implausibility. In response to these challenges, local supervision, encompassing Local Learning (LL) and Forward Learning (FL), has emerged as a promising research direction. LL employs module-wise BP to achieve competitive results yet relies on module-wise auxiliary networks, which increase memory and parameter demands. Conversely, FL updates layer weights without BP and auxiliary networks but falls short of BP’s performance. This paper proposes a simple yet effective objective within a contrastive learning framework for local supervision without auxiliary networks. Given the insight that the existing contrastive learning framework for local supervision is susceptible to task-irrelevant information without auxiliary networks, we present Dictionary Contrastive Learning (DCL), which optimizes the similarity between local features and label embeddings. Our method using static label embeddings yields substantial performance improvements in the FL scenario, outperforming state-of-the-art FL approaches. Moreover, our method using adaptive label embeddings closely approaches the performance achieved by LL while achieving superior memory and parameter efficiency.
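To make the core idea concrete, below is a minimal sketch of what a dictionary-contrastive local objective could look like. It is an illustration based only on the abstract, not the paper's actual implementation: the function name `dcl_loss`, the cosine-similarity formulation, the temperature value, and the pooling of local activations are all assumptions; the paper's exact loss may differ.

```python
import torch
import torch.nn.functional as F

def dcl_loss(features, dictionary, labels, temperature=0.1):
    """Hypothetical dictionary-contrastive loss (sketch, not the paper's code).

    features:   local-module activations pooled to shape (batch, dim)
    dictionary: label embeddings of shape (num_classes, dim); frozen for the
                static (FL) variant, trainable for the adaptive variant
    labels:     class indices of shape (batch,)
    """
    f = F.normalize(features, dim=-1)      # (B, D) unit-norm local features
    e = F.normalize(dictionary, dim=-1)    # (C, D) unit-norm label embeddings
    logits = f @ e.t() / temperature       # (B, C) feature-to-label similarities
    # Pull each feature toward its own label's embedding and push it away
    # from the other label embeddings -- a standard contrastive objective.
    return F.cross_entropy(logits, labels)
```

Under this reading, each module would compute the loss on its own output and update locally, with no module-wise auxiliary network; e.g. `dcl_loss(module_out.flatten(1), label_dict, targets).backward()` keeps gradients confined to that module.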
