Poster
in
Workshop: Physics for Machine Learning

Self-Supervised Learning with Lie Symmetries for Partial Differential Equations

Grégoire Mialon · Quentin Garrido · Hannah Lawrence · Danyal Rehman · Yann LeCun · Bobak Kiani


Abstract:

Machine learning for differential equations paves the way for computationally efficient alternatives to numerical solvers, with potentially broad impacts in science and engineering. Though current algorithms typically require simulated training data tailored to a given setting, one may instead wish to learn useful information from heterogeneous sources, or from messy or incomplete observations of real dynamical systems. In this work, we learn general-purpose representations of PDEs from heterogeneous data by implementing joint embedding methods for self-supervised learning (SSL), a framework for unsupervised representation learning that has had notable success in computer vision. Our representation outperforms baseline approaches on invariant tasks, such as regressing the coefficients of a PDE, while also improving the time-stepping performance of neural solvers. Data augmentation is central to SSL: although simple augmentation strategies such as cropping provide satisfactory results, our inclusion of transformations corresponding to the symmetry group of a given PDE significantly improves the quality of the learned representations.
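To make the augmentation idea concrete, here is a minimal sketch (not the authors' code) of two Lie point symmetries of Burgers' equation, u_t + u u_x = ν u_xx, applied as data augmentations to a discretized solution on a periodic spatial grid: a spatial translation u(x, t) → u(x − s, t) and a Galilean boost u(x, t) → u(x − ct, t) + c. The array shape convention (time × space) and the grid-rounding of the boost shift are assumptions for illustration.

```python
import numpy as np

def spatial_translation(u, shift):
    """Spatial translation symmetry u(x, t) -> u(x - s, t).

    u: solution array of shape (nt, nx) on a periodic spatial grid;
    shift: translation in grid points (periodic wrap-around via roll).
    """
    return np.roll(u, shift, axis=1)

def galilean_boost(u, c, dx, dt):
    """Galilean boost symmetry of Burgers' equation:
    u(x, t) -> u(x - c t, t) + c.

    The time-dependent shift c*t is rounded to the nearest grid point,
    an approximation valid when c*dt/dx is close to an integer.
    """
    nt, nx = u.shape
    out = np.empty_like(u)
    for i in range(nt):
        shift = int(round(c * i * dt / dx))  # shift grows linearly in time
        out[i] = np.roll(u[i], shift) + c    # shift in x, then add the boost velocity
    return out
```

In a joint embedding SSL pipeline, two such randomly parameterized transformations of the same solution would form a positive pair, and the encoder is trained so that their embeddings agree; because each transformation maps solutions of the PDE to other solutions, the learned representation is encouraged to capture properties (such as PDE coefficients) that the symmetry group leaves unchanged.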
