CMStatistics 2023
B0862
Title: Regularized empirical likelihood for Bayesian inference: Theory and applications
Authors: Eunseop Kim - Eli Lilly and Company (United States)
Steven MacEachern - The Ohio State University (United States)
Mario Peruggia - The Ohio State University (United States) [presenting]
Abstract: Bayesian inference with empirical likelihood faces a challenge because the posterior domain is a proper subset of the original parameter space due to the convex hull constraint. A regularized, exponentially tilted empirical likelihood is proposed to address this issue. The method removes the convex hull constraint using a novel regularization technique, incorporating a continuous exponential family distribution to satisfy a Kullback-Leibler divergence criterion. The regularization arises as a limiting procedure in which pseudo-data are added in a disciplined way to the formulation of an exponentially tilted empirical likelihood. It is shown that this regularized exponentially tilted empirical likelihood retains certain desirable asymptotic properties of exponentially tilted empirical likelihood with improved finite-sample performance. Simulations and data analysis demonstrate that the proposed method provides a suitable pseudo-likelihood for Bayesian inference.
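To make the convex hull issue concrete, the following is a minimal sketch (not the authors' implementation) of a standard exponentially tilted empirical likelihood (ETEL) for a scalar mean, computed by minimizing the convex dual with SciPy. The function name `log_etel_mean` and the mean-parameter example are illustrative assumptions; the sketch only shows why the log pseudo-likelihood is undefined (here returned as -inf) outside the convex hull of the data, which is the constraint the proposed regularization removes.

```python
# Sketch: exponentially tilted empirical likelihood (ETEL) for a scalar mean,
# illustrating the convex hull constraint on the posterior domain.
import numpy as np
from scipy.optimize import minimize

def log_etel_mean(theta, x):
    """Log ETEL for the mean parameter theta given data x.

    Returns -inf when theta lies outside the convex hull of the data,
    i.e. when the exponential tilting equation has no solution.
    """
    g = x - theta                       # estimating function g(x_i, theta)
    if theta <= x.min() or theta >= x.max():
        return -np.inf                  # convex hull constraint violated

    # Inner problem: minimize the convex dual d(lam) = mean(exp(lam * g_i));
    # its stationarity condition is the usual tilting equation.
    def dual(lam):
        return np.mean(np.exp(lam * g))

    def dual_grad(lam):
        return np.array([np.mean(np.exp(lam * g) * g)])

    res = minimize(dual, x0=np.zeros(1), jac=dual_grad, method="BFGS")
    lam = res.x[0]

    # Tilted weights w_i proportional to exp(lam * g_i); log ETEL = sum_i log w_i.
    logw = lam * g - np.log(np.sum(np.exp(lam * g)))
    return np.sum(logw)

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=30)
print(log_etel_mean(1.0, x))   # finite inside the convex hull of the data
print(log_etel_mean(5.0, x))   # -inf outside it: the posterior domain shrinks
```

The abstract's regularization instead augments this formulation with pseudo-data from a continuous exponential family, chosen via a Kullback-Leibler divergence criterion, so that the resulting pseudo-likelihood is defined on the full parameter space; that construction is not reproduced here.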