EcoSta 2024 Submission A0349
Title: Natural gradient hybrid variational inference with application to deep mixed models
Authors: Weiben Zhang - University of Melbourne (Australia) [presenting]
Michael Stanley Smith - Melbourne Business School (Australia)
Worapree Ole Maneesoonthorn - Monash University (Australia)
Ruben Loaiza-Maya - Monash University (Australia)
Abstract: Stochastic models with global parameters $\theta$ and latent variables $z$ are common, and variational inference (VI) is popular for their estimation. A variational approximation (VA) is used that comprises a Gaussian with a factor covariance matrix for the marginal posterior of $\theta$, along with the exact conditional posterior of $z|\theta$. Stochastic optimization for learning the VA requires generating $z$ from its conditional posterior, while $\theta$ is updated using the natural gradient, producing a hybrid VI method. It is shown that this is a well-defined natural gradient optimization algorithm for the joint posterior of $(z, \theta)$. Fast-to-compute expressions are derived for the Tikhonov-damped Fisher information matrix required for a stable natural gradient update. The approach is used to estimate probabilistic Bayesian neural networks with random output-layer coefficients that allow for heterogeneity. Simulations show that using the natural gradient is more efficient than using the ordinary gradient, and that the approach is faster and more accurate than two leading benchmark natural gradient VI methods. In a financial application, accounting for industry-level heterogeneity with the deep model is shown to improve the accuracy of probabilistic predictions from asset pricing models.
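To make the hybrid update concrete, below is a minimal sketch of the iteration the abstract describes: draw $z$ from its exact conditional posterior, then take a Tikhonov-damped natural-gradient step on the variational parameters of $q(\theta)$. This is an illustration under stated assumptions, not the authors' implementation: the toy model, the diagonal Gaussian VA (used in place of the factor-covariance Gaussian for readability), and the constants `tau` and `step` are all introduced for the example.

```python
# Sketch of hybrid VI with a Tikhonov-damped natural gradient (illustrative
# only). Assumed toy model: y_i = theta + z_i + e_i with z_i ~ N(0,1),
# e_i ~ N(0,1), and a flat prior on theta, so p(z | theta, y) is Gaussian in
# closed form (the "exact conditional" ingredient of the hybrid VA). The VA
# for theta is a diagonal Gaussian q(theta) = N(mu, sig^2) rather than the
# factor-covariance Gaussian of the abstract; the update logic is the same idea.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(1.5, np.sqrt(2.0), size=200)  # synthetic data, y_i ~ N(1.5, 2)
mu, log_sig = 0.0, 0.0                       # variational parameters lambda
tau, step = 0.1, 0.005                       # Tikhonov damping, step size

for _ in range(2000):
    # 1. Draw theta ~ q(theta) by reparameterization, then z from its exact
    #    conditional posterior p(z | theta, y) = N((y - theta)/2, 1/2).
    eps = rng.standard_normal()
    theta = mu + np.exp(log_sig) * eps
    z = 0.5 * (y - theta) + np.sqrt(0.5) * rng.standard_normal(y.size)

    # 2. Unbiased ELBO gradient w.r.t. lambda = (mu, log_sig): by Fisher's
    #    identity, d/dtheta log p(y, z, theta) evaluated at z ~ p(z | theta, y)
    #    estimates d/dtheta log p(y, theta).
    dlog_dtheta = np.sum(y - theta - z)
    g_mu = dlog_dtheta
    g_log_sig = dlog_dtheta * np.exp(log_sig) * eps + 1.0  # +1 from entropy

    # 3. Fisher information of q(theta) in (mu, log_sig) coordinates is
    #    diag(1/sig^2, 2); Tikhonov damping adds tau to the diagonal so the
    #    inverse stays stable.
    F = np.array([np.exp(-2.0 * log_sig), 2.0]) + tau

    # 4. Natural-gradient ascent with the damped Fisher:
    #    lambda += step * F_damped^{-1} * grad.
    mu += step * g_mu / F[0]
    log_sig += step * g_log_sig / F[1]

# Exact posterior here is N(mean(y), 2/n): mu should land near mean(y) and
# sig near sqrt(2/200) ~= 0.1.
print(f"mu = {mu:.3f} (ybar = {y.mean():.3f}), sig = {np.exp(log_sig):.3f}")
```

In the factor-covariance case of the abstract the damped Fisher matrix is no longer diagonal, and the fast-to-compute expressions the paper derives presumably exploit its low-rank-plus-diagonal structure; the elementwise division above is the special case for a diagonal VA.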