


Memory Mosaics

Jianyu Zhang · Niklas Nolte · Ranajoy Sadhukhan · Beidi Chen · Leon Bottou

2025 Poster

Abstract:

Memory Mosaics are networks of associative memories working in concert to achieve a prediction task of interest. Like transformers, memory mosaics possess compositional and in-context learning capabilities. Unlike transformers, memory mosaics achieve these capabilities in a comparatively transparent way (“predictive disentanglement”). We illustrate these capabilities on a toy example and also show that memory mosaics perform as well as or better than transformers on medium-scale language modeling tasks.
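The abstract describes the building block as an associative memory that stores and retrieves key-value pairs. A minimal sketch of such a unit is shown below, using kernel-smoothed retrieval over the stored pairs; the function name, the Gaussian-kernel choice, and the toy dimensions are illustrative assumptions rather than the authors' exact formulation.

```python
# Sketch of one associative-memory unit: (key, value) pairs are stored, and a
# query is answered by a kernel-weighted average of the stored values.
# The Gaussian kernel and all names here are assumptions for illustration.
import torch


def kernel_retrieve(query, keys, values, beta=1.0):
    """Retrieve a value for `query` by kernel smoothing over stored pairs.

    query:  (d,)      probing key
    keys:   (n, d)    stored keys
    values: (n, d_v)  stored values
    """
    # Gaussian-kernel weights from squared distances to each stored key.
    sq_dists = ((keys - query) ** 2).sum(dim=-1)       # (n,)
    weights = torch.softmax(-beta * sq_dists, dim=-1)  # (n,)
    return weights @ values                            # (d_v,)


# Toy usage: store a few pairs, then query near one of the stored keys.
torch.manual_seed(0)
keys = torch.randn(5, 8)
values = torch.randn(5, 4)
query = keys[2] + 0.01 * torch.randn(8)
print(kernel_retrieve(query, keys, values))  # close to values[2]
```

In a full model, many such units would operate in parallel over a sequence, each extracting its own keys and values from the input; this sketch only illustrates the retrieval step for a single unit.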
