Poster

The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks

Rahim Entezari · Hanie Sedghi · Olga Saukh · Behnam Neyshabur

Virtual

Keywords: [ loss landscape ] [ deep learning ] [ permutation ] [ invariance ] [ mode connectivity ]


Abstract:

In this paper, we conjecture that if the permutation invariance of neural networks is taken into account, SGD solutions will likely have no barrier in the linear interpolation between them. Although it is a bold conjecture, we show how extensive empirical attempts fall short of refuting it. We further provide a preliminary theoretical result to support our conjecture. Our conjecture has implications for the lottery ticket hypothesis, distributed training, and ensemble methods. The source code is available at https://github.com/rahimentezari/PermutationInvariance.
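To make the abstract's central quantity concrete, here is a minimal sketch of the loss barrier along a linear interpolation between two networks, and of how permuting hidden units changes one network's weights without changing its function. This is an illustrative toy (one-hidden-layer ReLU regression; all names, shapes, and the task are our assumptions, not the paper's implementation).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                 # toy inputs
y = np.tanh(X @ rng.normal(size=(3,)))       # toy regression targets

def mse_loss(params, X, y):
    # One hidden ReLU layer: loss of the network given by (W1, W2).
    W1, W2 = params
    pred = np.maximum(X @ W1, 0.0) @ W2
    return float(np.mean((pred - y) ** 2))

def interpolate(p, q, alpha):
    # Linear interpolation in weight space.
    return [(1 - alpha) * a + alpha * b for a, b in zip(p, q)]

def barrier(p, q, X, y, n=11):
    # Max over alpha of loss on the path minus the linear
    # interpolation of the endpoint losses (0 at both endpoints).
    lp, lq = mse_loss(p, X, y), mse_loss(q, X, y)
    return max(
        mse_loss(interpolate(p, q, a), X, y) - ((1 - a) * lp + a * lq)
        for a in np.linspace(0.0, 1.0, n)
    )

def permute_hidden(params, perm):
    # Permuting hidden units leaves the computed function unchanged.
    W1, W2 = params
    return [W1[:, perm], W2[perm]]

h = 8
p = [rng.normal(size=(3, h)), rng.normal(size=(h,))]
perm = rng.permutation(h)
q = permute_hidden(p, perm)       # functionally identical to p

# Naive interpolation between p and its permuted copy may hit a barrier;
# undoing the permutation first recovers identical weights, so the
# barrier vanishes -- the phenomenon the conjecture is about.
inv = np.argsort(perm)
b_naive = barrier(p, q, X, y)
b_aligned = barrier(p, permute_hidden(q, inv), X, y)
```

The conjecture says that, for typical SGD solutions (not just permuted copies of one network), some permutation of one endpoint brings the barrier close to zero.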
