Title: Some new developments in nonparametric Bayesian inference
Authors: Mahmoud Zarepour - University of Ottawa (Canada) [presenting]
Abstract: Ferguson's Dirichlet process places a prior on the space of all probability measures. Although realizations of this prior are discrete probability measures, its support is dense (in the weak topology) in the space of all probability measures. The Dirichlet prior behaves like the frequentist empirical process, and this can be confirmed through asymptotic theory. For example, the Bayesian bootstrap is asymptotically equivalent to its frequentist counterpart, and an analogous $m$ out of $n$ Bayesian bootstrap can be introduced that works like the frequentist $m$ out of $n$ bootstrap. The goal is to present an overview of topics in nonparametric Bayesian inference that have equivalent developments in the frequentist paradigm. Some other important priors that behave like the Dirichlet process but offer more flexibility will be introduced, and a brief historical overview will be provided. Data augmentation and its applications to machine learning will also be discussed. We also provide extensions that derive priors from many other general infinitely divisible processes (both univariate and multivariate).
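The asymptotic equivalence mentioned above can be illustrated numerically. The sketch below (not part of the talk; sample size, replicate count, and the simulated data are illustrative choices) contrasts Rubin's Bayesian bootstrap, which reweights the observed points with Dirichlet$(1,\dots,1)$ weights, against the classical bootstrap, which resamples points with replacement; for the sample mean, the two resampling distributions agree closely in centre and spread.

```python
import numpy as np

def bayesian_bootstrap_means(x, n_rep=2000, rng=None):
    """Rubin's Bayesian bootstrap for the mean: each replicate draws
    weights w ~ Dirichlet(1, ..., 1) over the n observed points and
    returns the weighted mean w @ x."""
    rng = rng or np.random.default_rng()
    w = rng.dirichlet(np.ones(len(x)), size=n_rep)  # (n_rep, n) weights
    return w @ x

def classical_bootstrap_means(x, n_rep=2000, rng=None):
    """Efron's bootstrap for the mean: each replicate resamples
    n points with replacement and averages them."""
    rng = rng or np.random.default_rng()
    idx = rng.integers(0, len(x), size=(n_rep, len(x)))
    return x[idx].mean(axis=1)

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)  # illustrative sample

bb = bayesian_bootstrap_means(data, rng=rng)
cb = classical_bootstrap_means(data, rng=rng)

# Both resampling distributions centre on the sample mean, with
# spread close to the usual standard error s / sqrt(n).
print(f"Bayesian  bootstrap: mean={bb.mean():.3f}, sd={bb.std():.3f}")
print(f"Classical bootstrap: mean={cb.mean():.3f}, sd={cb.std():.3f}")
```

The $m$ out of $n$ variants replace the $n$-point resample (or the symmetric Dirichlet over $n$ points) with one of size $m < n$, which matters for heavy-tailed or otherwise non-regular settings.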