

In-Person Poster presentation / poster accept

Simplicial Hopfield networks

Thomas F Burns · Tomoki Fukai

MH1-2-3-4 #74

Keywords: [ Neuroscience and Cognitive Science ] [ memory capacity ] [ attention ] [ Hopfield network ] [ Topology ] [ Associative Memory ] [ computational neuroscience ] [ Simplicial Complex ]


Abstract: Hopfield networks are artificial neural networks which store memory patterns in the states of their neurons by choosing recurrent connection weights and update rules such that the energy landscape of the network forms attractors around the memories. How many stable, sufficiently attracting memory patterns can we store in such a network using $N$ neurons? The answer depends on the choice of weights and update rule. Inspired by setwise connectivity in biology, we extend Hopfield networks by adding setwise connections and embedding these connections in a simplicial complex. Simplicial complexes are higher-dimensional analogues of graphs which naturally represent collections of pairwise and setwise relationships. We show that our simplicial Hopfield networks increase memory storage capacity. Surprisingly, even when connections are limited to a small random subset of equivalent size to an all-pairwise network, our networks still outperform their pairwise counterparts; in these cases, the networks exhibit non-trivial simplicial topology. We also test analogous modern continuous Hopfield networks, offering a potentially promising avenue for improving the attention mechanism in Transformer models.
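The abstract's starting point can be illustrated concretely. Below is a minimal sketch (not the paper's implementation) of a classical pairwise Hopfield network with Hebbian weights, together with an illustrative third-order "setwise" energy term of the kind a simplicial extension adds; all names, sizes, and the triplet coupling scheme here are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                        # number of +/-1 neurons
patterns = rng.choice([-1, 1], size=(3, N))   # memories to store

# Hebbian pairwise weights: W_ij = (1/N) * sum_mu x_i^mu x_j^mu, no self-connections.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def pairwise_energy(s):
    # Classical Hopfield energy: E(s) = -1/2 * s^T W s.
    return -0.5 * s @ W @ s

def setwise_energy(s, triplets, T):
    # Illustrative third-order term: E_3(s) = -sum_{(i,j,k)} T_ijk s_i s_j s_k,
    # with one coupling per stored triplet (a hypothetical setwise connection).
    return -sum(t * s[i] * s[j] * s[k] for (i, j, k), t in zip(triplets, T))

def recall(s, steps=5):
    # Asynchronous sign updates; each flip cannot increase the pairwise energy,
    # so the state descends into an attractor around a stored memory.
    s = s.copy()
    for _ in range(steps):
        for i in range(N):
            h = W[i] @ s
            if h != 0:
                s[i] = np.sign(h)
    return s

# Corrupt a stored pattern and let the dynamics recall it.
probe = patterns[0].copy()
flip = rng.choice(N, size=6, replace=False)
probe[flip] *= -1
recovered = recall(probe)
print(int((recovered == patterns[0]).sum()), "of", N, "bits match the stored memory")

# Evaluating a setwise term on a pattern (couplings here are arbitrary placeholders).
triplets = [tuple(rng.choice(N, size=3, replace=False)) for _ in range(10)]
T = np.ones(len(triplets)) / N
print("example setwise energy:", setwise_energy(patterns[0], triplets, T))
```

With only 3 patterns in 64 neurons the network is far below the classical capacity limit, so the corrupted probe typically descends back to the stored memory; the paper's contribution is showing how adding setwise terms like `setwise_energy` (organized as a simplicial complex) raises that capacity.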
