CMStatistics 2023
B1768
Title: When does Metropolized Hamiltonian Monte Carlo provably outperform Metropolis-adjusted Langevin algorithm?
Authors: Yuansi Chen - Duke University (United States) [presenting]
Abstract: The mixing time of Metropolized Hamiltonian Monte Carlo (HMC) with the leapfrog integrator is analysed for sampling from a distribution whose log density is smooth, has a Lipschitz Hessian in Frobenius norm, and satisfies isoperimetry. The gradient complexity to reach $\epsilon$ error in total variation distance from a warm start is bounded by $O(d^{1/4}\mathrm{polylog}(1/\epsilon))$, demonstrating the benefit of choosing the number of leapfrog steps to be larger than 1. To improve on the previous analysis of the Metropolis-adjusted Langevin algorithm (MALA), which has $O(d^{1/2}\mathrm{polylog}(1/\epsilon))$ dimension dependency, the proof reveals a key feature: the joint distribution of the location and velocity variables of the discretization of the continuous HMC dynamics stays approximately invariant. This key feature, shown via induction over the number of leapfrog steps, enables estimates on moments of various quantities that appear in the acceptance-rate control of Metropolized HMC. To illustrate the applicability of the result, several examples of natural functions that fall into the framework are discussed.
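As a concrete reference point for the algorithm the abstract analyses, the following is a minimal sketch (not the authors' implementation) of Metropolized HMC with a leapfrog integrator in Python/NumPy; the function names `leapfrog` and `hmc_step` and all step-size choices are illustrative assumptions. Setting `n_steps = 1` recovers a MALA-like proposal, while `n_steps > 1` is the regime the abstract argues is provably beneficial.

```python
import numpy as np

def leapfrog(grad_logp, q, p, step, n_steps):
    """Leapfrog integration of Hamiltonian dynamics for target log density logp.

    q: position, p: velocity/momentum; grad_logp returns the gradient of logp.
    """
    q, p = q.copy(), p.copy()
    p += 0.5 * step * grad_logp(q)            # initial half step in momentum
    for _ in range(n_steps - 1):
        q += step * p                         # full step in position
        p += step * grad_logp(q)              # full step in momentum
    q += step * p                             # final full position step
    p += 0.5 * step * grad_logp(q)            # final half step in momentum
    return q, p

def hmc_step(logp, grad_logp, q, step, n_steps, rng):
    """One Metropolized HMC transition: resample velocity, integrate, accept/reject."""
    p = rng.standard_normal(q.shape)          # fresh Gaussian velocity
    q_new, p_new = leapfrog(grad_logp, q, p, step, n_steps)
    # Metropolis correction for the discretization error of the leapfrog integrator;
    # the Hamiltonian is -logp(q) + |p|^2 / 2.
    log_accept = (logp(q_new) - 0.5 * p_new @ p_new) - (logp(q) - 0.5 * p @ p)
    if np.log(rng.uniform()) < log_accept:
        return q_new
    return q
```

For example, sampling a standard Gaussian target uses `logp = lambda q: -0.5 * q @ q` and `grad_logp = lambda q: -q`, iterating `hmc_step` from any starting point.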