Paper of the Month: March 2023

Once a month during the academic year, the statistics faculty select a paper for our students to read and discuss. Papers are selected based on their impact or historical value, or because they contain useful techniques or results.


David M. Blei, Alp Kucukelbir & Jon D. McAuliffe (2017) Variational Inference: A Review for Statisticians, Journal of the American Statistical Association, 112:518, 859-877.

Friday, March 3, 12:00 pm
AUST 326

Notes Preparer: Min Lin, PhD Student

In Bayesian analysis, the posterior density is typically intractable, and Markov chain Monte Carlo
(MCMC) methods are routinely applied to make inferences. In their review paper, Blei et al. (2017) discuss
variational inference (VI) as an optimization-based alternative to MCMC that is faster and easier to scale
to large datasets. When the task is to fit an appropriate model for prediction on large, complex data,
variational inference is a valuable tool in the statistician's toolbox before jumping directly to MCMC.
Moreover, VI opens the door to modern probabilistic machine learning methods such as the variational
auto-encoder (Kingma and Welling 2022).
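Concretely, the paper frames VI as follows: posit a family of approximating densities q(z) over the latent variables and maximize the evidence lower bound (ELBO),

    ELBO(q) = E_q[ log p(z, x) ] - E_q[ log q(z) ],

which, because log p(x) = ELBO(q) + KL( q(z) || p(z | x) ), is equivalent to minimizing the KL divergence from q to the exact posterior.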

In this discussion, we will give a gentle introduction to mean-field variational inference under the KL
divergence, emphasizing its properties for exponential families. We will introduce the coordinate ascent
variational inference (CAVI) and stochastic variational inference (SVI) algorithms, the former of which
is closely related to Gibbs sampling. The algorithms will be illustrated on data simulated from a
Gaussian mixture model. We will also discuss the phenomenon that VI generally underestimates the
posterior variance, a consequence of minimizing KL(q || p), which penalizes q for placing mass where
the posterior has little.
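For concreteness, here is a minimal NumPy sketch of the CAVI updates for the unit-variance Bayesian Gaussian mixture worked through in Section 3 of Blei et al. (2017): the prior is mu_k ~ N(0, sigma2), and the mean-field family takes q(mu_k) = N(m_k, s2_k) and q(c_i) = Categorical(phi_i). The simulation settings, seed, and iteration count below are illustrative choices, not prescriptions from the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    # Simulate n observations from a K-component, unit-variance Gaussian
    # mixture whose component means are drawn from the prior N(0, sigma2).
    K, n, sigma2 = 3, 500, 25.0
    true_mu = rng.normal(0.0, np.sqrt(sigma2), size=K)
    c = rng.integers(0, K, size=n)
    x = rng.normal(true_mu[c], 1.0)

    # Initialize the variational parameters.
    m = rng.normal(0.0, 1.0, size=K)   # means of q(mu_k)
    s2 = np.ones(K)                    # variances of q(mu_k)

    for _ in range(100):
        # Update the responsibilities: phi[i, k] is proportional to
        # exp{ E[mu_k] * x_i - E[mu_k^2] / 2 } under the current q.
        logits = np.outer(x, m) - 0.5 * (s2 + m**2)
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)

        # Update each q(mu_k): a precision-weighted average of the data.
        nk = phi.sum(axis=0)
        s2 = 1.0 / (1.0 / sigma2 + nk)
        m = s2 * (phi.T @ x)

    print("variational means:", np.sort(m))
    print("true means:       ", np.sort(true_mu))

Roughly speaking, SVI scales this recipe to large datasets by replacing the full-data sums phi.sum(axis=0) and phi.T @ x with unbiased minibatch estimates and updating the global parameters (m, s2) with a decreasing Robbins-Monro step size.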

In a recent volume of JASA (Volume 117, 2022), the most-read paper is also on variational inference:
Ray and Szabó (2022) study a mean-field spike-and-slab variational approximation to Bayesian model
selection priors in sparse high-dimensional linear regression.

References:

Blei, D. M., Kucukelbir, A., and McAuliffe, J. D. (2017), "Variational Inference: A Review for Statisticians," Journal of the American Statistical Association, 112:518, 859-877.

Kingma, D. P., and Welling, M. (2022), "Auto-Encoding Variational Bayes," arXiv:1312.6114.

Ray, K., and Szabó, B. (2022), "Variational Bayes for High-Dimensional Linear Regression With Sparse Priors," Journal of the American Statistical Association, 117:539, 1270-1281.