

In-Person Poster presentation / poster accept

Spatial Attention Kinetic Networks with E(n)-Equivariance

Yuanqing Wang · John Chodera

MH1-2-3-4 #20

Keywords: [ Deep Learning and representational learning ]


Abstract: Neural networks that are equivariant to rotations, translations, reflections, and permutations on $n$-dimensional geometric space have shown promise in physical modeling, for tasks ranging from accurately but inexpensively modeling complex potential energy surfaces to guiding the sampling of complex dynamical systems or forecasting their time evolution. Current state-of-the-art methods employ spherical harmonics to encode higher-order interactions among particles, which are computationally expensive. In this paper, we propose a simple alternative functional form that uses neurally parametrized linear combinations of edge vectors to achieve equivariance while still universally approximating node environments. Incorporating this insight, we design \emph{spatial attention kinetic networks} with E(n)-equivariance, or SAKE, which are competitive in many-body system modeling tasks while being significantly faster.
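The core idea can be illustrated with a minimal sketch, not taken from the paper's actual implementation: position updates built from linear combinations of edge vectors, with coefficients computed from invariant quantities (node features and pairwise distances), are automatically E(n)-equivariant. All names below (`equivariant_update`, `weight_fn`) are hypothetical illustrations of this general construction.

```python
import numpy as np

def equivariant_update(x, h, weight_fn):
    """Update node positions by a weighted linear combination of edge vectors.

    x: (n, d) node positions; h: (n, k) invariant node features.
    weight_fn(h_i, h_j, dist) returns an invariant scalar coefficient,
    standing in for a learned neural network. Because edge vectors rotate
    with the frame and translations cancel in the differences, the update
    is equivariant to rotations, reflections, and translations.
    """
    n = x.shape[0]
    dx = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            e_ij = x[j] - x[i]                    # equivariant edge vector
            w = weight_fn(h[i], h[j], np.linalg.norm(e_ij))  # invariant weight
            dx[i] += w * e_ij
    return x + dx
```

Applying any orthogonal transform and translation to the inputs produces the identically transformed outputs, which can be checked numerically with a random orthogonal matrix.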
