EcoSta 2023
A1068
Title: Wasserstein convergence in Bayesian deconvolution models
Authors: Catia Scricciolo - University of Verona (Italy) [presenting]
Judith Rousseau - University of Oxford (United Kingdom)
Abstract: The purpose is to investigate the multivariate deconvolution problem of recovering the distribution of a signal from i.i.d. observations additively contaminated with random errors having known distribution. We investigate whether a Bayesian nonparametric approach to modelling the latent distribution of the signal can yield inferences with asymptotic frequentist validity under the $L^1$-Wasserstein metric. For errors with independent coordinates having ordinary smooth densities, we derive an inversion inequality relating the $L^1$-Wasserstein distance between two distributions of the signal to the $L^1$-distance between the corresponding mixture densities of the observations. This inequality leads to minimax-optimal rates of contraction for the posterior measure on the distribution of the signal. As an illustration, we consider a Dirichlet process mixture-of-normals prior on the mixing distribution and Laplace noise. We construct an adaptive approximation of the sampling density by convolving the Laplace density with a well-chosen mixture of normal densities and show that the posterior measure concentrates around the true density at the minimax rate, up to a logarithmic factor, in the $L^1$-metric. The same posterior law automatically adapts to the Sobolev regularity of the mixing density, leading to a new Bayesian adaptive estimation procedure for mixing distributions with regular densities under the $L^1$-Wasserstein metric.
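The sampling model in the abstract (a latent signal observed through additive noise with known distribution) can be illustrated with a small simulation. The sketch below is not the authors' procedure; it merely draws a hypothetical normal-mixture signal, contaminates it with Laplace errors as in the illustration above, and computes the empirical $L^1$-Wasserstein distance between the signal and the contaminated observations using SciPy. The mixture weights and noise scale are arbitrary choices for illustration.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
n = 5000

# Hypothetical latent signal: a two-component normal mixture
def sample_signal(size):
    comp = rng.random(size) < 0.4
    return np.where(comp,
                    rng.normal(-1.0, 0.5, size),
                    rng.normal(1.5, 0.7, size))

x = sample_signal(n)

# Observations: signal additively contaminated with Laplace
# (ordinary smooth) noise of known scale
y = x + rng.laplace(loc=0.0, scale=0.3, size=n)

# Empirical L1-Wasserstein distance between the signal sample
# and the contaminated sample; by a coupling argument it is
# bounded by E|error| = 0.3 up to sampling fluctuation
w1 = wasserstein_distance(x, y)
print(f"W1(signal, observations) = {w1:.3f}")
```

Deconvolution asks the harder question addressed in the abstract: recovering the law of `x` from `y` alone, given only the Laplace error distribution.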