

Poster
in
Workshop: Physics for Machine Learning

Studying Phase Transitions in Contrastive Learning With Physics-Inspired Datasets

Ali Cy · Anugrah Chemparathy · Michael Han · Rumen R Dangovski · Peter Lu · Marin Soljacic


Abstract:

In recent years, contrastive learning has become a state-of-the-art technique in representation learning, but the exact mechanisms by which it trains are not well understood. By focusing on physics-inspired datasets with low intrinsic dimensionality, we can visualize and study contrastive training procedures at higher resolution. We empirically study the geometric development of contrastively learned embeddings, discovering phase transitions in which locally metastable embedding conformations evolve toward an optimal structure. Ultimately, we show a strong experimental link between stronger augmentations and decreased training time for contrastively learning more geometrically meaningful representations.
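The abstract does not specify the contrastive objective used; a common choice in this setting is an InfoNCE-style loss, where embeddings of two augmentations of the same sample form a positive pair and all other pairs in the batch act as negatives. Below is a minimal, dependency-free sketch of that loss (the function names and the toy batch are illustrative assumptions, not from the paper):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(view1, view2, temperature=0.5):
    """InfoNCE loss over a batch of paired embeddings.

    view1[i] and view2[i] are embeddings of two augmentations of the
    same sample (the positive pair); every other cross-view pair in
    the batch serves as a negative.
    """
    n = len(view1)
    losses = []
    for i in range(n):
        # Similarity of anchor view1[i] to every candidate in view2.
        sims = [cosine(view1[i], view2[j]) / temperature for j in range(n)]
        log_denom = math.log(sum(math.exp(s) for s in sims))
        # Negative log-softmax probability assigned to the positive pair.
        losses.append(log_denom - sims[i])
    return sum(losses) / n

# Toy batch: positives are nearly aligned, so the loss is small.
v1 = [[1.0, 0.0], [0.0, 1.0]]
v2 = [[0.9, 0.1], [0.1, 0.9]]
loss = info_nce(v1, v2)
```

Stronger augmentations, in this framing, push the positive pairs farther apart in input space, making the loss landscape steeper and, per the abstract's finding, shortening training time.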
