B0721
Title: Universal boosting variational inference
Authors: Trevor Campbell - University of British Columbia (Canada) [presenting]
Abstract: Boosting variational inference (BVI) approximates a probability density by building up a mixture of simple component distributions one at a time, using techniques from sparse convex optimization to provide both computational scalability and error guarantees. However, these guarantees rely on strong conditions that rarely hold in practice, leading to degenerate component optimization problems, and the ad hoc regularization used to prevent degeneracy in practice can cause BVI to fail in unintuitive ways. The purpose is to introduce universal boosting variational inference (UBVI), a BVI scheme that exploits the simple geometry of probability densities under the Hellinger metric to prevent the degeneracy of other gradient-based BVI methods and to avoid difficult joint optimizations over both components and weights. We will develop a scalable implementation of UBVI and show that, for any target density and any mixture component family, the output converges to the best possible approximation in the mixture family, even when the family is misspecified. We will discuss the statistical benefits of the Hellinger distance as a variational objective through bounds on posterior probability, moment, and importance sampling errors. Experimental results will demonstrate that UBVI provides reliable and accurate posterior approximations with little to no tuning effort.
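To make the geometric idea concrete, the following is a minimal numerical sketch, not the paper's algorithm: it assumes a one-dimensional target on a quadrature grid, Gaussian components, a crude grid search in place of gradient-based component optimization, and clipped least squares in place of a proper weight update. All names here (sqrt_gauss, normalize, and so on) are hypothetical illustrations. The key point it demonstrates is that square-root densities are unit vectors in L2, so each greedy boosting step reduces to maximizing alignment with the current residual, and the Hellinger distance can be read off an inner product.

```python
import numpy as np

xs = np.linspace(-10, 10, 2001)          # quadrature grid for a 1D toy problem
dx = xs[1] - xs[0]

def sqrt_gauss(mu, sigma):
    """Square root of a N(mu, sigma^2) density evaluated on the grid."""
    pdf = np.exp(-0.5 * ((xs - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.sqrt(pdf)

def normalize(v):
    """Rescale to unit L2 norm, so v^2 integrates to 1 on the grid."""
    return v / np.sqrt(np.sum(v ** 2) * dx)

# Target: square root of a bimodal density (normalized on the grid).
target = normalize(sqrt_gauss(-2.0, 1.0) + sqrt_gauss(2.5, 0.7))

# Candidate component parameters; this grid search is a crude stand-in for
# the gradient-based component optimization described in the abstract.
candidates = [(mu, s) for mu in np.linspace(-5, 5, 41) for s in (0.5, 1.0, 2.0)]

components = []
approx = np.zeros_like(xs)
for step in range(4):
    residual = target - approx
    # Greedy step: pick the component whose square-root density is most
    # aligned (in L2) with the residual direction.
    scores = [np.sum(sqrt_gauss(mu, s) * residual) * dx for mu, s in candidates]
    mu, s = candidates[int(np.argmax(scores))]
    components.append(normalize(sqrt_gauss(mu, s)))
    # Re-solve the mixture weights by least squares onto the target, with
    # clipping to nonnegativity as a simple stand-in for a proper update.
    A = np.stack(components, axis=1)
    w, *_ = np.linalg.lstsq(A, target, rcond=None)
    w = np.clip(w, 0.0, None)
    approx = normalize(A @ w)
    # Hellinger distance: H^2 = 1 - <sqrt(p), sqrt(q)> for unit vectors.
    hellinger = np.sqrt(max(0.0, 1.0 - np.sum(approx * target) * dx))
    print(f"step {step + 1}: added N({mu:.2f}, {s}^2), Hellinger ~ {hellinger:.3f}")
```

Running this sketch, the Hellinger distance decreases monotonically as components are added, illustrating why working on the unit sphere of square-root densities sidesteps the degenerate component optimizations and joint component-weight problems mentioned above.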