Contributed Talk in Workshop: Neural Compression: From Information Theory to Applications

Oral 2: Yangjun Ruan et al., Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding

Yang Yang


Abstract:

Latent variable models have been successfully applied in lossless compression with the bits-back coding algorithm. However, bits-back suffers from an increase in the bitrate equal to the KL divergence between the approximate posterior and the true posterior. In this paper, we show how to remove this gap asymptotically by deriving bits-back schemes from tighter variational bounds. The key idea is to exploit extended space representations of Monte Carlo estimators of the marginal likelihood. Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space. We demonstrate improved lossless compression rates in a variety of settings.
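To make the two quantities in the abstract concrete, here is a sketch in standard variational notation (assumed here, not quoted from the talk): for a latent variable model with joint p(x, z) and approximate posterior q(z | x), the expected bits-back code length (in nats) decomposes as

\[
\mathbb{E}_{q(z \mid x)}\!\left[\log \frac{q(z \mid x)}{p(x, z)}\right]
= -\log p(x) \;+\; \mathrm{KL}\bigl(q(z \mid x)\,\|\,p(z \mid x)\bigr),
\]

so the overhead relative to the ideal rate \(-\log p(x)\) is exactly the posterior KL term the abstract refers to. The tighter variational bounds come from K-sample importance-weighted (Monte Carlo) estimators of the marginal likelihood,

\[
\mathcal{L}_K \;=\; \mathbb{E}_{z_1, \dots, z_K \sim q(z \mid x)}\!\left[\log \frac{1}{K} \sum_{k=1}^{K} \frac{p(x, z_k)}{q(z_k \mid x)}\right] \;\le\; \log p(x),
\]

and under mild conditions \(\mathcal{L}_K \to \log p(x)\) as \(K \to \infty\), which is the sense in which the KL gap is removed asymptotically.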