A0310
Title: Analytic natural gradient updates for Cholesky factor in Gaussian variational approximation
Authors: Siew Li Linda Tan - National University of Singapore (Singapore) [presenting]
Abstract: Stochastic gradient methods have enabled variational inference for high-dimensional models. However, the steepest ascent direction in the parameter space of a statistical model is given by the natural gradient, which premultiplies the widely used Euclidean gradient by the inverse Fisher information matrix. Natural gradients can improve convergence, but inverting the Fisher information matrix is daunting in high dimensions. In Gaussian variational approximation, natural gradient updates of the mean and precision of the normal distribution can be derived analytically, but they do not ensure that the precision matrix remains positive definite. To tackle this issue, the Cholesky decomposition of the covariance or precision matrix is considered, and analytic natural gradient updates of the Cholesky factor are derived, which depend on either the first or second derivative of the log posterior density. Efficient natural gradient updates of the Cholesky factor are also derived under sparsity constraints representing different posterior correlation structures. As Adam's adaptive learning rate does not work well with natural gradients, stochastic normalized natural gradient ascent with momentum is proposed. The efficiency of the proposed methods is demonstrated using logistic regression and generalized linear mixed models.
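
As a reminder of the definition invoked above: for variational parameters \lambda of a density q_\lambda and objective \mathcal{L}(\lambda) (the evidence lower bound), the natural gradient is the Euclidean gradient premultiplied by the inverse Fisher information,

\[
\widetilde{\nabla}_\lambda \mathcal{L} = F(\lambda)^{-1} \nabla_\lambda \mathcal{L},
\qquad
F(\lambda) = \mathrm{E}_{q_\lambda}\!\big[ \nabla_\lambda \log q_\lambda(\theta)\, \nabla_\lambda \log q_\lambda(\theta)^\top \big].
\]

For q_\lambda = N(\mu, \Sigma), the Fisher information block for \mu is \Sigma^{-1}, so the natural gradient with respect to the mean is simply \Sigma \nabla_\mu \mathcal{L}; it is the analogous analytic update for a Cholesky-factor parameterization that the paper derives.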
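Below is a minimal sketch, not the paper's algorithm, of stochastic normalized natural gradient ascent with momentum for a Gaussian approximation q = N(mu, C C^T) with lower-triangular Cholesky factor C. The toy log-posterior, step-size schedule, and momentum weight are assumptions; the mean update uses the standard natural gradient Sigma * grad, while the Cholesky factor uses a plain Euclidean ELBO gradient, since the paper's analytic Cholesky-factor natural gradient is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)
d = 5
mu = np.ones(d)              # variational mean (toy initialization)
C = 0.5 * np.eye(d)          # lower-triangular Cholesky factor of the covariance
m_mu = np.zeros(d)           # momentum buffers
m_C = np.zeros((d, d))
rho0, beta = 0.1, 0.9        # base step size and momentum weight (assumed values)

def grad_log_post(theta):
    """Gradient of a toy standard-normal log-posterior (illustration only)."""
    return -theta

for t in range(2000):
    z = rng.standard_normal(d)
    theta = mu + C @ z                        # reparameterization trick
    g = grad_log_post(theta)                  # Euclidean gradient w.r.t. theta

    # Natural gradient for the mean: the Fisher block for mu is Sigma^{-1},
    # so the natural gradient is Sigma * g = C C^T g.
    nat_mu = C @ (C.T @ g)

    # Euclidean ELBO gradient for C: lower triangle of g z^T plus the gradient
    # of the entropy term log|det C| = sum_i log C_ii (placeholder for the
    # paper's analytic natural-gradient update of the Cholesky factor).
    grad_C = np.tril(np.outer(g, z)) + np.diag(1.0 / np.diag(C))

    # Normalized gradients with heavy-ball momentum and a decaying step size;
    # note nothing here enforces positivity of diag(C) -- a sketch, not a safeguard.
    rho = rho0 / np.sqrt(t + 1.0)
    m_mu = beta * m_mu + (1.0 - beta) * nat_mu / (np.linalg.norm(nat_mu) + 1e-12)
    m_C = beta * m_C + (1.0 - beta) * grad_C / (np.linalg.norm(grad_C) + 1e-12)
    mu = mu + rho * m_mu
    C = C + rho * m_C

print("mean estimate:", np.round(mu, 3))      # approaches the posterior mean (0 here)

Normalizing the (natural) gradient to unit norm before the momentum average keeps the step length controlled by the schedule rho alone, which is one plausible reading of the "normalized" scheme named in the abstract and motivates the decaying step size used above.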