A0500
Title: Robustness and outlier detection of Bayesian model residuals with mixtures of normal, heavy-tailed and skewed components
Authors: Alexandra Posekany - University of Technology Vienna (Austria) [presenting]
Abstract: Outliers and skewed or heavy-tailed data frequently occur in data-analysis problems across many fields. We consider notions of Bayesian robustness for various model types and compare them against classical robust estimators. Three aspects form the basis of Bayesian robustness: prior, likelihood and loss robustness. Generally, a careful Bayesian analysis addresses prior robustness through a sensitivity analysis, varying hyper-parameters and checking their influence. Loss functions connect with classical notions of robustness, e.g. reporting the posterior median rather than the mean; yet they are often disregarded, since estimating means is the basis of Monte Carlo simulation. The final notion of Bayesian robustness is robustifying the likelihood: normally distributed likelihood models are often chosen mainly for computational convenience. We aim to robustly estimate the parameters of the main part of the data through a normal or skewed likelihood, while simultaneously identifying the outlying part of the data through one or more skewed or heavy-tailed mixture components. Through the component labels and posterior weights, we can identify the noisy or outlying parts of the data for filtering or for inspecting data quality; a simplified sketch of this idea follows below.
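
The abstract does not specify a particular algorithm, so the following is only a minimal illustrative sketch of the general idea: a two-component contamination mixture in which a normal component models the bulk of the data and a wider (variance-inflated) component absorbs outliers, with a Gibbs sampler producing the latent component labels and posterior weights used to flag outlying observations. All priors, hyper-parameter values and the fixed scale-inflation factor `c` are hypothetical choices, and the full approach in the abstract additionally allows skewed and heavy-tailed components.

```python
import numpy as np
from scipy.stats import invgamma, norm

rng = np.random.default_rng(1)

# Simulated data: bulk N(2, 1) contaminated with 5% gross outliers (hypothetical example).
n = 300
y = rng.normal(2.0, 1.0, size=n)
out_idx = rng.choice(n, size=15, replace=False)
y[out_idx] += rng.choice([-1, 1], size=15) * rng.uniform(6.0, 10.0, size=15)

# Hyper-parameters (assumed, not from the abstract).
mu0, tau0 = 0.0, 10.0      # N(mu0, tau0^2) prior on the common location
a_sig, b_sig = 2.0, 2.0    # inverse-gamma prior on the bulk variance
a_w, b_w = 1.0, 9.0        # Beta prior on the outlier weight, favouring few outliers
c = 5.0                    # fixed scale inflation of the outlier component

# Initial values.
mu, sig2, w = np.median(y), np.var(y), 0.1
n_iter, burn = 4000, 1000
z_store = np.zeros(n)                       # accumulates posterior label frequencies
draws = {"mu": [], "sig2": [], "w": []}

for it in range(n_iter):
    # 1. Sample latent labels z_i (0 = bulk, 1 = outlier) from their responsibilities.
    lp0 = np.log(1.0 - w) + norm.logpdf(y, mu, np.sqrt(sig2))
    lp1 = np.log(w) + norm.logpdf(y, mu, c * np.sqrt(sig2))
    p1 = 1.0 / (1.0 + np.exp(lp0 - lp1))
    z = (rng.uniform(size=n) < p1).astype(int)

    # 2. Sample the outlier weight from its conjugate Beta posterior.
    n1 = z.sum()
    w = rng.beta(a_w + n1, b_w + n - n1)

    # 3. Sample the common location mu (heteroscedastic conjugate normal update).
    m = np.where(z == 1, c**2, 1.0)         # per-observation variance multiplier
    prec = (1.0 / (sig2 * m)).sum() + 1.0 / tau0**2
    mean = ((y / (sig2 * m)).sum() + mu0 / tau0**2) / prec
    mu = rng.normal(mean, 1.0 / np.sqrt(prec))

    # 4. Sample the bulk variance from its conjugate inverse-gamma posterior.
    sig2 = invgamma.rvs(a_sig + n / 2.0,
                        scale=b_sig + 0.5 * ((y - mu) ** 2 / m).sum(),
                        random_state=rng)

    if it >= burn:
        z_store += z
        draws["mu"].append(mu)
        draws["sig2"].append(sig2)
        draws["w"].append(w)

# Observations with posterior outlier probability above 0.5 are flagged for inspection.
post_outlier_prob = z_store / (n_iter - burn)
flagged = np.where(post_outlier_prob > 0.5)[0]
print("posterior mean of mu:", np.mean(draws["mu"]))
print("flagged outliers:", np.sort(flagged))
print("true outliers:   ", np.sort(out_idx))
```

In this sketch the posterior label frequencies play the role of the "component labels and posterior weights" mentioned in the abstract: observations persistently assigned to the inflated component are reported for filtering or data-quality inspection, while the bulk component delivers the robust estimate of the location and scale.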