

Virtual presentation / poster accept

Approximate Bayesian Inference with Stein Functional Variational Gradient Descent

Tobias Pielok · Bernd Bischl · David Rügamer

Keywords: [ Probabilistic Methods ]


Abstract:

We propose a general-purpose variational algorithm that forms a natural analogue of Stein variational gradient descent (SVGD) in function space. While SVGD successively updates a set of particles to match a target density, the method introduced here, Stein functional variational gradient descent (SFVGD), updates a set of particle functions to match a target stochastic process (SP). The update step is derived by minimizing the functional derivative of the Kullback-Leibler divergence between SPs. SFVGD can be used either to train Bayesian neural networks (BNNs) or for ensemble gradient boosting. We show the efficacy of training BNNs with SFVGD on various real-world datasets.
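The sketch below illustrates one plausible reading of the update described in the abstract: each particle function is represented by its values at a fixed batch of measurement points, and a standard SVGD-style update is applied to those evaluations, with the target SP score taken from a Gaussian process prior plus a Gaussian likelihood. This is not the authors' reference implementation; all function names, the GP/likelihood choice, and the hyperparameters are hypothetical assumptions for illustration.

```python
# Minimal sketch of an SVGD-style update in function space, assuming particle
# functions are represented by their values at fixed measurement points and
# the target stochastic process is a GP prior with a Gaussian likelihood.
# Illustrative only; not the paper's reference implementation.
import numpy as np

def rbf_kernel(F, h=None):
    """RBF kernel between particle evaluations F of shape (n_particles, m)."""
    sq_dists = np.sum((F[:, None, :] - F[None, :, :]) ** 2, axis=-1)
    if h is None:  # median heuristic bandwidth, as in standard SVGD
        h = np.median(sq_dists) / np.log(F.shape[0] + 1) + 1e-8
    K = np.exp(-sq_dists / h)
    # grad_K[j, i] = gradient w.r.t. F[j] of k(F[j], F[i])
    grad_K = -2.0 / h * (F[:, None, :] - F[None, :, :]) * K[:, :, None]
    return K, grad_K

def target_score(F, X, y, noise=0.1, length_scale=1.0):
    """Score of an (unnormalized) GP posterior at the measurement points."""
    sq = (X[:, None] - X[None, :]) ** 2
    K_prior = np.exp(-0.5 * sq / length_scale**2) + 1e-6 * np.eye(len(X))
    prior_score = -np.linalg.solve(K_prior, F.T).T   # grad log N(F; 0, K)
    lik_score = (y[None, :] - F) / noise**2          # grad log N(y; F, s^2)
    return prior_score + lik_score

def sfvgd_step(F, X, y, step=1e-2):
    """One SVGD-style update applied jointly to all particle functions."""
    score = target_score(F, X, y)
    K, grad_K = rbf_kernel(F)
    # phi(F_i) = mean_j [ k(F_j, F_i) * score(F_j) + grad_{F_j} k(F_j, F_i) ]
    phi = (K.T @ score + grad_K.sum(axis=0)) / F.shape[0]
    return F + step * phi

rng = np.random.default_rng(0)
X = np.linspace(-2, 2, 20)                    # measurement points
y = np.sin(2 * X) + 0.1 * rng.normal(size=20) # noisy observations
F = rng.normal(size=(10, 20))                 # 10 particle functions
for _ in range(500):
    F = sfvgd_step(F, X, y)
print("posterior mean at X[0]:", F[:, 0].mean())
```

Evaluating the particle functions at a shared batch of measurement points is one common way to make a function-space KL objective tractable, since it reduces the update to the familiar finite-dimensional SVGD form (kernel-weighted target scores plus a repulsive kernel-gradient term). The paper's actual construction via the functional derivative of the KL divergence between stochastic processes may differ.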
