A1335
Title: Sampling from density power divergence-based generalized posterior distribution via stochastic optimization
Authors: Naruki Sonobe - Tokyo University of Science (Japan) [presenting]
Tomotaka Momozaki - Tokyo University of Science (Japan)
Tomoyuki Nakagawa - Meisei University & RIKEN Center for Brain Science (Japan)
Abstract: Robust Bayesian inference using the density power divergence (DPD) has emerged as a promising approach for handling outliers in statistical estimation. While the DPD-based posterior offers theoretical guarantees of robustness, its practical application to general parametric models is computationally challenging due to an analytically intractable integral term in its formulation. These challenges are particularly pronounced in high-dimensional settings, where traditional numerical integration methods are inadequate and computationally expensive. The aim is to propose a new approximate sampling methodology that addresses these limitations by integrating the loss-likelihood bootstrap with a stochastic gradient descent algorithm tailored to DPD-based estimation. The approach enables efficient and scalable sampling from DPD-based posteriors for a broad class of parametric models, including those with intractable integral terms. It is further extended to accommodate generalized linear models. Through simulations, it is demonstrated that the method generates samples that accurately approximate the target posterior while offering superior computational scalability. The results confirm that this framework provides a practical and efficient tool for applying robust Bayesian inference to complex, high-dimensional data.
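The two ingredients named in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: it uses a Gaussian location-scale model, for which the DPD integral term is actually tractable in closed form (the paper targets models where it is not), and all function names, step sizes, and the finite-difference gradient are illustrative assumptions. Each loss-likelihood-bootstrap draw assigns random exponential weights to the observations and runs minibatch stochastic gradient descent on the weighted DPD loss; the resulting minimizer is treated as one approximate draw from the DPD-based generalized posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

def dpd_loss(theta, x, w, alpha=0.5):
    """Weighted DPD loss for a N(mu, sigma^2) model.

    The integral term (2*pi*sigma^2)^(-alpha/2) / sqrt(1+alpha) is
    available in closed form for the Gaussian; the paper's method
    targets models where it is not, but the bootstrap/SGD mechanics
    sketched here are the same.
    """
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return np.sum(w * (integral - (1 + 1 / alpha) * f**alpha))

def num_grad(theta, x, w, alpha, eps=1e-5):
    # central finite differences keep the sketch model-agnostic
    g = np.zeros_like(theta)
    for j in range(theta.size):
        e = np.zeros_like(theta)
        e[j] = eps
        g[j] = (dpd_loss(theta + e, x, w, alpha)
                - dpd_loss(theta - e, x, w, alpha)) / (2 * eps)
    return g

def llb_draw(x, alpha=0.5, steps=300, lr=0.05, batch=32):
    """One loss-likelihood-bootstrap draw: random exponential weights,
    then minibatch SGD on the weighted DPD loss."""
    n = x.size
    w = rng.exponential(size=n)
    w /= w.mean()                       # weights average to 1
    mad = np.median(np.abs(x - np.median(x)))
    theta = np.array([np.median(x), np.log(1.4826 * mad)])  # robust start
    for _ in range(steps):
        idx = rng.choice(n, size=min(batch, n), replace=False)
        theta -= lr * num_grad(theta, x[idx], w[idx], alpha) / idx.size
    return theta

# Contaminated data: mostly N(1, 1), plus 5% outliers near 15.
x = np.concatenate([rng.normal(1.0, 1.0, 190), rng.normal(15.0, 1.0, 10)])
draws = np.array([llb_draw(x) for _ in range(50)])
mu_draws = draws[:, 0]
# Outliers contribute almost nothing to the f**alpha term, so the
# posterior draws for mu should concentrate near 1 despite contamination.
print(mu_draws.mean())
```

Because `f**alpha` vanishes for gross outliers, they barely influence the gradient, which is the source of the robustness; the bootstrap weights, not a Markov chain, supply the posterior variability, so the draws are embarrassingly parallel.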