CMStatistics 2022: Submission B1441
Title: Statistical modeling within the generalized Bayes paradigm
Authors: Tommaso Rigon - University of Milano-Bicocca (Italy) [presenting]
Amy Herring - Duke University (United States)
David Dunson - Duke University (United States)
Abstract: Loss-based clustering methods, such as $k$-means and its variants, are standard tools for finding groups in data. However, they provide no quantification of uncertainty in the estimated clusters. Model-based clustering via mixture models offers an alternative, but such methods face computational difficulties and high sensitivity to the choice of kernel. A generalized Bayes framework is proposed that bridges these paradigms through the use of Gibbs posteriors. In the Bayesian update, the log-likelihood is replaced by a loss function for clustering, leading to a rich family of clustering methods. The Gibbs posterior represents a coherent updating of Bayesian beliefs without the need to specify a likelihood for the data, and can be used to characterize uncertainty in clustering. We consider losses based on Bregman divergences and pairwise similarities, and develop efficient deterministic algorithms for point estimation along with sampling algorithms for uncertainty quantification. Several existing clustering algorithms, including $k$-means, can be interpreted as generalized Bayes estimators under our framework, which hence provides uncertainty quantification for these approaches, for example allowing calculation of the probability that a data point is well clustered.
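The following is a minimal sketch of the Gibbs-posterior construction described in the abstract; the notation is generic and not taken from the submission itself. Given data $x_{1:n}$, cluster assignments $c$, a clustering loss $\ell(c; x_{1:n})$, a prior $\pi(c)$, and a learning-rate parameter $\lambda > 0$, the likelihood in Bayes' rule is replaced by the exponentiated negative loss:

$$\pi_n(c \mid x_{1:n}) \propto \exp\{-\lambda\, \ell(c; x_{1:n})\}\, \pi(c).$$

Taking $\ell$ to be, for instance, the within-cluster sum of squares links the framework to $k$-means, whose solution can then be read as a generalized Bayes point estimate, while the full Gibbs posterior over assignments supplies the uncertainty quantification, such as the probability that a given point is well clustered.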