CFE-CMStatistics 2025
A0291
Title: Bayesian empirical likelihood for multitask learning: Computations and theory
Authors: Weichang Yu - University of Melbourne (Australia) [presenting]
Abstract: A novel Bayesian semiparametric multitask learning method is proposed. Task-relatedness is modeled using a variety of combinations of hierarchical setups and shrinkage priors to accommodate different plausible relatedness structures. To avoid misspecification of the data-generating process, Bayesian empirical likelihoods are used to infer both local and shared parameters. The Bayesian empirical likelihood posterior support is typically non-convex and consequently challenging to sample from. This computational challenge is amplified in multitask settings, where the likelihood arises from a combination of multiple datasets, further complicating the posterior support structure. To address this, a sequential Monte Carlo algorithm is developed that draws from a sequence of adjusted empirical likelihood posteriors with decreasing adjustment levels. This sequence of adjustment levels mitigates irregularities at intermediate steps, allowing particles to explore the posterior space efficiently. It is shown that the intermediate adjusted posterior is continuous in the tempering parameter and converges to the original target posterior as the adjustment level approaches zero, thus guaranteeing algorithmic convergence. The efficiency and convergence of the method are demonstrated through a simulation study and empirical applications.
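The sequential Monte Carlo scheme described in the abstract — intermediate adjusted empirical likelihood posteriors with a decreasing adjustment level — can be illustrated on a toy scalar-mean problem. The sketch below is a minimal illustration under assumptions of my own, not the authors' algorithm: a Chen–Variyath–Abraham-style adjustment (pseudo-observation -a*gbar), a normal prior, random-walk Metropolis–Hastings moves, and an ad hoc adjustment schedule; all function names and tuning constants are hypothetical.

```python
import math
import random

def log_ael(theta, x, a):
    """Log adjusted empirical likelihood (up to a constant) for a mean
    parameter theta, with estimating function g(x, theta) = x - theta.
    For a > 0, the pseudo-observation -a * gbar is appended, which keeps
    zero inside the convex hull of the g's and smooths the support."""
    g = [xi - theta for xi in x]
    gbar = sum(g) / len(g)
    if a > 0.0 and gbar != 0.0:
        g.append(-a * gbar)
    gmax, gmin = max(g), min(g)
    if gmax <= 0.0 or gmin >= 0.0:
        return -math.inf  # zero outside the convex hull: EL is zero
    # Solve sum_i g_i / (1 + lam * g_i) = 0 by bisection; the left-hand
    # side is strictly decreasing on the feasible interval (-1/gmax, -1/gmin).
    lo, hi = -1.0 / gmax + 1e-12, -1.0 / gmin - 1e-12
    for _ in range(60):
        lam = 0.5 * (lo + hi)
        if sum(gi / (1.0 + lam * gi) for gi in g) > 0.0:
            lo = lam
        else:
            hi = lam
    lam = 0.5 * (lo + hi)
    return -sum(math.log1p(lam * gi) for gi in g)

def smc_adjusted_el(x, a_schedule, n_particles=100, prior_sd=3.0, seed=7):
    """SMC sampler whose intermediate targets are adjusted-EL posteriors
    with decreasing adjustment levels: reweight, resample, then MH move."""
    rng = random.Random(seed)
    log_post = lambda t, a: -0.5 * (t / prior_sd) ** 2 + log_ael(t, x, a)
    parts = [rng.gauss(0.0, prior_sd) for _ in range(n_particles)]
    prev_a = None
    for a in a_schedule:
        # Incremental weight: ratio of adjusted ELs (the prior factor cancels
        # because particles already carry the prior distribution).
        logw = [log_ael(t, x, a)
                - (log_ael(t, x, prev_a) if prev_a is not None else 0.0)
                for t in parts]
        m = max(lw for lw in logw if math.isfinite(lw))
        w = [math.exp(lw - m) if math.isfinite(lw) else 0.0 for lw in logw]
        parts = rng.choices(parts, weights=w, k=n_particles)
        for _ in range(2):  # a couple of random-walk MH moves per level
            nxt = []
            for t in parts:
                prop = t + rng.gauss(0.0, 0.25)
                d = log_post(prop, a) - log_post(t, a)
                nxt.append(prop if rng.random() < math.exp(min(0.0, d)) else t)
            parts = nxt
        prev_a = a
    return parts

# Usage: data with true mean 2; the final level a = 0 corresponds to the
# unadjusted EL posterior, approached through decreasing adjustment levels.
data_rng = random.Random(1)
x = [data_rng.gauss(2.0, 1.0) for _ in range(40)]
particles = smc_adjusted_el(x, a_schedule=[20.0, 8.0, 3.0, 1.0, 0.3, 0.1, 0.0])
post_mean = sum(particles) / len(particles)
xbar = sum(x) / len(x)
```

At a large adjustment level the adjusted-EL surface is nearly flat, so early stages are easy to explore; as the level decreases the intermediate posteriors sharpen toward the original EL posterior, whose support is restricted to the convex hull of the data — the smoothing role that, per the abstract, the decreasing adjustment sequence plays in the multitask setting.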