

Poster in Workshop: Frontiers in Probabilistic Inference: learning meets Sampling

Posterior Inference with Diffusion Models for High-dimensional Black-box Optimization

Taeyoung Yun · Kiyoung Om · Jaewoo Lee · Sujin Yun · Jinkyoo Park


Abstract:

Optimizing high-dimensional and complex black-box functions is crucial in numerous scientific applications. While Bayesian optimization (BO) is a powerful method for sample-efficient optimization, it struggles with the curse of dimensionality and with scaling to thousands of evaluations. Recently, leveraging generative models to solve black-box optimization problems has emerged as a promising framework. However, these methods often underperform BO due to limited expressivity and the difficulty of uncertainty estimation in high-dimensional spaces. To overcome these issues, we introduce DiBO, a novel framework for solving high-dimensional black-box optimization problems. Our method alternates between two stages. First, we train a diffusion model to capture the data distribution and an ensemble of proxies to predict function values with uncertainty quantification. Second, we cast candidate selection as a posterior inference problem to balance exploration and exploitation in high-dimensional spaces; concretely, we fine-tune the diffusion model to amortize posterior inference. Extensive experiments demonstrate that our method outperforms state-of-the-art baselines across various synthetic and real-world black-box optimization tasks. Our code is publicly available at https://anonymous.4open.science/r/DiBO-E486.
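To make the two-stage loop concrete, here is a minimal runnable sketch, not the authors' implementation: the diffusion model is replaced by a Gaussian stand-in fitted to high-value points, the proxies are random-feature ridge regressors rather than neural networks, and the posterior-inference step is approximated by sampling candidates from the model and reweighting them with a UCB-style score (ensemble mean plus disagreement). All function names, constants, and the toy objective are illustrative assumptions.

```python
# Hypothetical sketch of the two-stage loop described in the abstract.
# The diffusion model and fine-tuned posterior sampler are replaced by
# simple stand-ins; every name and constant here is an assumption.

import numpy as np

rng = np.random.default_rng(0)

def black_box(x):
    # Toy objective standing in for the true black-box function.
    return -np.sum((x - 0.5) ** 2, axis=-1)

DIM, N_INIT, BATCH, ROUNDS, ENSEMBLE = 20, 64, 16, 5, 4

X = rng.uniform(0, 1, size=(N_INIT, DIM))
y = black_box(X)

def fit_proxy(X, y, seed):
    # Stand-in proxy: ridge regression on random cosine features
    # (a real implementation would train a neural network).
    rs = np.random.default_rng(seed)
    W = rs.normal(size=(DIM, 128))
    b = rs.uniform(0, 2 * np.pi, size=128)
    Phi = np.cos(X @ W + b)
    w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(128), Phi.T @ y)
    return lambda Z: np.cos(Z @ W + b) @ w

for t in range(ROUNDS):
    # Stage 1: fit a generative model of the data and an ensemble of
    # proxies whose disagreement serves as an uncertainty estimate.
    proxies = [fit_proxy(X, y, seed=t * 100 + k) for k in range(ENSEMBLE)]
    top = X[np.argsort(y)[-32:]]                 # high-value region
    mu, sigma = top.mean(0), top.std(0) + 1e-3   # Gaussian stand-in for the diffusion model

    # Stage 2: posterior-style candidate selection. Sample from the model,
    # then reweight by predicted mean plus ensemble disagreement (UCB-like),
    # trading off exploitation against exploration.
    cand = rng.normal(mu, sigma, size=(512, DIM)).clip(0, 1)
    preds = np.stack([p(cand) for p in proxies])
    score = preds.mean(0) + 1.0 * preds.std(0)
    pick = cand[np.argsort(score)[-BATCH:]]

    X = np.vstack([X, pick])
    y = np.concatenate([y, black_box(pick)])
    print(f"round {t}: best value so far = {y.max():.4f}")
```

In the paper's framing, the sample-then-reweight step would instead be amortized by fine-tuning the diffusion model so that it samples directly from the posterior over promising candidates; the standard deviation across ensemble members is what drives exploration in either case.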
