

Poster

How connectivity structure shapes rich and lazy learning in neural circuits

Yuhan Helena Liu · Aristide Baratin · Jonathan Cornford · Stefan Mihalas · Eric Shea-Brown · Guillaume Lajoie

Halle B #60
Wed 8 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

In theoretical neuroscience, recent work leverages deep learning tools to explore how certain network attributes critically influence a network's learning dynamics. Notably, initial weight distributions with small (resp. large) variance may yield a rich (resp. lazy) regime, where significant (resp. minor) changes to network states and representations are observed over the course of learning. In biology, however, neural circuit connectivity generally has low-rank structure and therefore differs markedly from the random initializations typically used in these studies. Here we investigate how the structure of the initial weights, in particular their effective rank, influences the network learning regime. Through both empirical and theoretical analyses, we find that high-rank initializations typically yield smaller network changes indicative of lazier learning, a finding we also confirm with experimentally driven initial connectivity in recurrent neural networks. Conversely, low-rank initializations bias networks toward richer learning. Importantly, as an exception to this rule, we find that lazier learning can still occur with a low-rank initialization that aligns with task and data statistics. Our research highlights the pivotal role of initial weight structure in shaping learning regimes, with implications for the metabolic costs of plasticity and the risk of catastrophic forgetting.
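As a rough illustration of the setup the abstract describes, the sketch below (plain PyTorch; the task, network sizes, and hyperparameters are invented for illustration and are not taken from the paper) builds random initial weight matrices of controlled rank and reports the relative change in those weights after training a small one-hidden-layer network, a common proxy for distinguishing lazy (small change) from rich (large change) learning.

import torch

torch.manual_seed(0)

def low_rank_init(n, rank, scale=1.0):
    # Random n x n matrix of the given rank, rescaled to a fixed
    # Frobenius norm so that only the rank varies across conditions.
    U = torch.randn(n, rank)
    V = torch.randn(n, rank)
    W = U @ V.T
    return scale * W / W.norm()

def train_and_measure(W0, steps=500, lr=1e-2):
    # Hypothetical toy task (not the paper's experiments): train a
    # one-hidden-layer tanh network on a regression problem and return
    # the relative Frobenius-norm change in the hidden weights.
    n = W0.shape[0]
    W = W0.clone().requires_grad_(True)
    v = torch.randn(1, n, requires_grad=True)
    X = torch.randn(256, n)
    y = torch.sin(X[:, :1])  # arbitrary smooth target
    opt = torch.optim.SGD([W, v], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((torch.tanh(X @ W.T) @ v.T - y) ** 2).mean()
        loss.backward()
        opt.step()
    return ((W - W0).norm() / W0.norm()).item()

n = 64
for rank in (2, 8, 64):
    W0 = low_rank_init(n, rank, scale=n ** 0.5)
    print(f"rank {rank:3d}: relative weight change = {train_and_measure(W0):.3f}")

Under this proxy, one would expect the fuller-rank initializations to show smaller relative weight changes, in line with the poster's claim that high effective rank biases networks toward the lazy regime.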
