
In-Person Poster presentation / poster accept

A Self-Attention Ansatz for Ab-initio Quantum Chemistry

Ingrid von Glehn · James Spencer · David Pfau

MH1-2-3-4 #110

Keywords: [ Machine Learning for Sciences ] [ attention ] [ mcmc ] [ chemistry ] [ transformers ] [ monte carlo ] [ Machine learning for science ] [ machine learning for chemistry ] [ machine learning for molecules ] [ self-generative learning ] [ quantum physics ] [ machine learning for physics ]


Abstract:

We present a novel neural network architecture using self-attention, the Wavefunction Transformer (PsiFormer), which can be used as an approximation (or "Ansatz") for solving the many-electron Schrödinger equation, the fundamental equation of quantum chemistry and materials science. This equation can be solved from first principles, requiring no external training data. In recent years, deep neural networks like the FermiNet and PauliNet have significantly improved the accuracy of these first-principles calculations, but they lack an attention-like mechanism for gating interactions between electrons. Here we show that the PsiFormer can be used as a drop-in replacement for these other neural networks, often dramatically improving the accuracy of the calculations. On larger molecules especially, the ground state energy can be improved by dozens of kcal/mol, a qualitative leap over previous methods. This demonstrates that self-attention networks can learn complex quantum mechanical correlations between electrons, and are a promising route to reaching unprecedented accuracy in chemical calculations on larger systems.
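As a rough illustration of the mechanism described in the abstract (not the PsiFormer itself, whose actual architecture is detailed in the paper), a single self-attention layer over per-electron feature vectors mixes information across all electron pairs while remaining equivariant under electron permutation — the property a fermionic Ansatz's determinant layer then converts into the required antisymmetry. A minimal NumPy sketch, with hypothetical projection matrices `Wq`, `Wk`, `Wv`:

```python
import numpy as np

def self_attention(h, Wq, Wk, Wv):
    """Single-head self-attention over per-electron features.

    h: (n_electrons, d) array, one feature vector per electron.
    Wq, Wk, Wv: (d, d) projection matrices (hypothetical parameters).
    Returns an updated (n_electrons, d) array.
    """
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / np.sqrt(h.shape[-1])         # pairwise electron-electron scores
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)                   # softmax over electrons
    return w @ v                                    # attention-weighted mixture

# Toy example: 4 electrons with 8 features each.
rng = np.random.default_rng(0)
n, d = 4, 8
h = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(h, Wq, Wk, Wv)
```

Because every electron attends to every other, permuting the input rows simply permutes the output rows — the permutation equivariance that makes attention a natural building block for a many-electron wavefunction.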