

Poster

Learning Energy Decompositions for Partial Inference in GFlowNets

Hyosoon Jang · Minsu Kim · Sungsoo Ahn

Halle B #77
Wed 8 May 1:45 a.m. PDT — 3:45 a.m. PDT
 
Oral presentation: Oral 3B
Wed 8 May 1 a.m. PDT — 1:45 a.m. PDT

Abstract:

This paper studies generative flow networks (GFlowNets), which sample objects from the Boltzmann energy distribution via a sequence of actions. In particular, we focus on improving GFlowNets with partial inference: training flow functions with evaluations of intermediate states or transitions. To this end, the recently developed forward-looking GFlowNet reparameterizes the flow functions based on evaluating the energies of intermediate states. However, such intermediate energies may (i) be too expensive or even impossible to evaluate and (ii) provide misleading training signals under large energy fluctuations along the sequence of actions. To resolve this issue, we propose learning energy decompositions for GFlowNets (LED-GFN). Our main idea is to (i) decompose the energy of an object into learnable potential functions defined on state transitions and (ii) reparameterize the flow functions using these potentials. In particular, to produce informative local credits, we regularize the potential to change smoothly over the sequence of actions. Notably, training GFlowNets with the learned potentials can preserve the optimal policy. We empirically verify the superiority of LED-GFN on five problems, including the generation of unstructured and maximum independent sets, molecular graphs, and RNA sequences.
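To make the decomposition concrete, below is a minimal PyTorch sketch of the two ingredients the abstract describes: a learnable potential over state transitions whose sum is fit to the terminal energy, plus a smoothness regularizer on the per-step potentials. The names (`PotentialNet`, `decomposition_loss`), the transition featurization, and the variance-based penalty are illustrative assumptions, not the paper's exact formulation.

```python
# A minimal sketch of the energy-decomposition idea. Trajectories are assumed
# to be given as tensors of per-transition features; the architecture and the
# variance-based smoothness penalty are one simple choice among several.
import torch
import torch.nn as nn

class PotentialNet(nn.Module):
    """Learnable potential phi(s -> s') defined on state transitions."""
    def __init__(self, feat_dim: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, transition_feats: torch.Tensor) -> torch.Tensor:
        # transition_feats: (T, feat_dim) features, one row per transition
        return self.mlp(transition_feats).squeeze(-1)  # (T,) local potentials

def decomposition_loss(potentials: torch.Tensor,
                       terminal_energy: torch.Tensor,
                       smooth_coef: float = 0.1) -> torch.Tensor:
    # (i) the per-transition potentials should sum to the energy E(x)
    #     of the terminal object
    recon = (potentials.sum() - terminal_energy) ** 2
    # (ii) regularize the potentials to change smoothly along the
    #      trajectory, here by penalizing their variance
    smooth = potentials.var()
    return recon + smooth_coef * smooth

# Usage: fit the potential on sampled trajectories; the learned potentials
# then serve as local credits when reparameterizing the GFlowNet flows.
phi = PotentialNet(feat_dim=16)
opt = torch.optim.Adam(phi.parameters(), lr=1e-3)

feats = torch.randn(10, 16)   # 10 transitions of one sampled trajectory
energy = torch.tensor(3.7)    # E(x) of the terminal object (given)
loss = decomposition_loss(phi(feats), energy)
opt.zero_grad(); loss.backward(); opt.step()
```

The key property this sketch mirrors is that only the terminal energy is ever queried: intermediate states receive credit through the learned potentials rather than through explicit (and possibly expensive or misleading) intermediate energy evaluations.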
