A1242
Title: Proximal Hamiltonian Monte Carlo
Authors: Eric Chi - University of Minnesota (United States) [presenting]
Dootika Vats - Indian Institute of Technology, Kanpur (India)
Apratim Shukla - Indian Institute of Technology, Kanpur (India)
Abstract: Modern statistical learning problems, especially in high dimensions, often lack effective sampling strategies. The difficulty is particularly pronounced in applications such as image denoising and sparse signal recovery, where the underlying Bayesian posterior density is non-differentiable, a direct consequence of the non-differentiable priors commonly employed to induce sparsity in the model. As a result, sampling becomes difficult even for efficient gradient-based Markov chain Monte Carlo (MCMC) methods. This problem is circumvented by proposing a proximal Hamiltonian Monte Carlo (p-HMC) algorithm, which employs proximal mappings and the Moreau-Yosida (MY) envelope within Hamiltonian dynamics. Conditions for geometric ergodicity of the underlying HMC chain are established, along with a methodology for choosing a suitable regularization parameter in the MY envelope. The method is implemented for a sparse logistic regression and a low-rank matrix estimation problem, demonstrating its efficiency over the current state of the art.
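
As a rough illustrative sketch (not the authors' implementation), the code below shows how a single proximal-HMC transition could be organized under the assumption that the negative log-posterior splits as f + g, with f smooth and g convex but non-differentiable. The non-smooth part g is replaced by its Moreau-Yosida envelope, whose gradient is available through the proximal mapping, so standard leapfrog integration and a Metropolis correction apply. All names and tuning values (f, g, prox_g, lam, eps, n_leapfrog) are hypothetical placeholders, not the paper's notation.

import numpy as np

# Moreau-Yosida envelope of g with parameter lam:
#   g_lam(x) = min_y { g(y) + ||x - y||^2 / (2*lam) },
# with gradient (x - prox_{lam*g}(x)) / lam.

def soft_threshold(x, t):
    # Prox of t * ||.||_1, the usual example of a sparsity-inducing prior.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def phmc_step(x, f, grad_f, g, prox_g, lam, eps, n_leapfrog, rng):
    # One Metropolis-adjusted HMC step on the smoothed potential f + g_lam.
    def U(z):
        p = prox_g(z, lam)                      # proximal point of z
        return f(z) + g(p) + np.sum((z - p) ** 2) / (2.0 * lam)

    def grad_U(z):
        return grad_f(z) + (z - prox_g(z, lam)) / lam

    mom = rng.standard_normal(x.shape)          # momentum refreshment
    x_new, mom_new = x.copy(), mom.copy()

    # Leapfrog integration of the smoothed Hamiltonian dynamics.
    mom_new -= 0.5 * eps * grad_U(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += eps * mom_new
        mom_new -= eps * grad_U(x_new)
    x_new += eps * mom_new
    mom_new -= 0.5 * eps * grad_U(x_new)

    # Accept or reject based on the change in total energy.
    h_old = U(x) + 0.5 * np.sum(mom ** 2)
    h_new = U(x_new) + 0.5 * np.sum(mom_new ** 2)
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Toy usage: Gaussian likelihood f plus an l1 prior g = alpha * ||x||_1.
rng = np.random.default_rng(0)
A, y, alpha = rng.standard_normal((50, 10)), rng.standard_normal(50), 1.0
f = lambda x: 0.5 * np.sum((A @ x - y) ** 2)
grad_f = lambda x: A.T @ (A @ x - y)
g = lambda x: alpha * np.sum(np.abs(x))
prox_g = lambda x, lam: soft_threshold(x, lam * alpha)
x = np.zeros(10)
for _ in range(100):
    x = phmc_step(x, f, grad_f, g, prox_g, lam=0.1, eps=0.05, n_leapfrog=10, rng=rng)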