In-Person Poster Presentation / Poster Accept

Sampling-free Inference for Ab-Initio Potential Energy Surface Networks

Nicholas Gao · Stephan Günnemann

MH1-2-3-4 #111

Keywords: [ Machine Learning for Sciences ] [ graph neural networks ] [ computational physics ] [ self-supervised learning ] [ molecules ] [ Machine learning for science ] [ online learning ] [ self-generative learning ]


Abstract:

Recently, it has been shown that neural networks not only approximate the ground-state wave functions of single molecular systems well but also generalize to multiple geometries. While such generalization significantly speeds up training, each energy evaluation still requires Monte Carlo integration, which limits evaluation to a few geometries. In this work, we address these inference shortcomings by proposing the Potential learning from ab-initio Networks (PlaNet) framework, in which we train a surrogate model alongside the neural wave function. At inference time, the surrogate avoids expensive Monte Carlo integration by directly estimating the energy, accelerating the process from hours to milliseconds. In this way, we can accurately model high-resolution, multi-dimensional energy surfaces for larger systems that were previously out of reach for neural wave functions. Finally, we explore an additional inductive bias by introducing physically motivated restricted neural wave function models. We implement such a function, with several additional improvements, in the new PESNet++ model. In our experimental evaluation, PlaNet accelerates inference by seven orders of magnitude for larger molecules such as ethanol while preserving accuracy. Compared to previous energy surface networks, PESNet++ reduces energy errors by up to 74%.
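The core idea of the abstract, training a cheap surrogate energy model alongside the neural wave function so that inference reduces to a single forward pass, can be sketched compactly. The following is a minimal, hypothetical illustration, not the authors' implementation: the plain MLP surrogate, the `surrogate_step` helper, and the stand-in VMC energy targets are all assumptions for demonstration (the paper's keywords suggest the real surrogate is a graph neural network, and the geometries would be molecular coordinates rather than random vectors).

```python
# Minimal sketch of the PlaNet idea in JAX (hypothetical shapes and names).
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    # Plain MLP parameters: the surrogate maps a geometry to a scalar energy.
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (din, dout)) / jnp.sqrt(din)
        params.append((w, jnp.zeros(dout)))
    return params

def mlp(params, x):
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b).squeeze(-1)

# Surrogate training: regress onto per-geometry Monte Carlo energy estimates,
# which the wave-function optimization already produces as a by-product.
def surrogate_loss(params, geoms, mc_energies):
    pred = mlp(params, geoms)
    return jnp.mean((pred - mc_energies) ** 2)

@jax.jit
def surrogate_step(params, geoms, mc_energies, lr=1e-3):
    loss, grads = jax.value_and_grad(surrogate_loss)(params, geoms, mc_energies)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss

key = jax.random.PRNGKey(0)
key, data_key = jax.random.split(key)
params = init_mlp(key, [6, 64, 64, 1])  # e.g. 6 internal coordinates

# During wave-function training, each batch of geometries comes with a noisy
# VMC energy estimate; the surrogate smooths and amortizes these estimates.
geoms = jax.random.normal(data_key, (32, 6))
mc_energies = jnp.sin(geoms).sum(-1)  # stand-in for real VMC estimates
params, loss = surrogate_step(params, geoms, mc_energies)

# Inference: one batched forward pass per geometry, no Monte Carlo integration.
energies = mlp(params, geoms)
```

The final call is what the abstract's speed-up refers to: instead of hours of Monte Carlo integration per geometry, evaluating the trained surrogate is a millisecond-scale forward pass.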
