

Poster in Workshop: Deep Generative Model in Machine Learning: Theory, Principle and Efficacy

Path Planning for Masked Diffusion Models with Applications to Biological Sequence Generation

Zhangzhi Peng · Zachary Bezemek · Sawan Patel · Jarrid Rector-Brooks · Sherwood Yao · Alexander Tong · Pranam Chatterjee

Keywords: [ discrete diffusion models ] [ RNA generation ] [ protein generation ] [ diffusion language models ]


Abstract:

In this paper, we investigate how the order in which tokens are unmasked during masked diffusion model (MDM) inference affects generative quality. We derive an expanded evidence lower bound (ELBO) that introduces a planner responsible for selecting which tokens to unmask at each step. Our analysis suggests that alternative unmasking strategies can improve generative performance. Based on these insights, we propose Path Planning (P2), a training-free inference framework that leverages a pre-trained BERT model or the denoiser itself to guide unmasking decisions. P2 generalizes all known MDM sampling strategies and enables significant improvements across diverse domains, including language generation (in-context learning, code generation, story infilling, mathematical reasoning, reverse curse correction) and biological sequence generation (protein and RNA sequences).
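To make the planner idea concrete, the following is a minimal, self-contained sketch of planner-guided unmasking under stated assumptions, not the authors' implementation. All names (denoiser_logits, planner_scores, MASK_ID, STEPS) are hypothetical stand-ins, and the toy planner simply ranks masked positions by denoiser confidence; per the abstract, P2 can instead use a pre-trained BERT or the denoiser itself as the planner.

```python
# Minimal sketch of planner-guided unmasking for a masked diffusion model (MDM).
# All interfaces below (denoiser_logits, planner_scores, MASK_ID, ...) are
# hypothetical stand-ins for illustration, not the authors' released code.
import torch

VOCAB = 32       # hypothetical token vocabulary size
MASK_ID = VOCAB  # mask token id, kept outside the denoiser's output range
LENGTH = 16      # sequence length
STEPS = 8        # number of unmasking steps

def denoiser_logits(x: torch.Tensor) -> torch.Tensor:
    """Toy denoiser: per-position vocabulary logits.
    In practice this is the pre-trained MDM."""
    torch.manual_seed(int(x.sum()))  # deterministic toy behavior
    return torch.randn(x.shape[0], VOCAB)

def planner_scores(logits: torch.Tensor) -> torch.Tensor:
    """Toy planner: score each position by denoiser confidence
    (max softmax probability). P2 can instead score positions with
    a pre-trained BERT or the denoiser itself."""
    return logits.softmax(dim=-1).max(dim=-1).values

def p2_style_sample() -> torch.Tensor:
    x = torch.full((LENGTH,), MASK_ID, dtype=torch.long)
    per_step = max(1, LENGTH // STEPS)             # tokens revealed per step
    for _ in range(STEPS):
        masked = (x == MASK_ID).nonzero(as_tuple=True)[0]
        if masked.numel() == 0:
            break
        logits = denoiser_logits(x)                # (LENGTH, VOCAB)
        scores = planner_scores(logits)[masked]    # planner ranks masked slots
        k = min(per_step, masked.numel())
        chosen = masked[scores.topk(k).indices]    # unmask top-scoring positions
        x[chosen] = logits[chosen].argmax(dim=-1)  # commit denoiser predictions
    return x

print(p2_style_sample())
```

Swapping in a different planner_scores function recovers different sampling strategies (for example, uniform random scores give random-order unmasking), which is one way to read the claim that P2 generalizes known MDM samplers.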
