A0381
Title: Bayesian estimation of extreme value mixture models: Simplifications and enhancements
Authors: Daniela Laas - University of Saint Gallen (Switzerland) [presenting]
Abstract: Extreme value mixture models combine the generalized Pareto distribution for the tail approximation with a parametric, semiparametric, or nonparametric model for the body. Bayesian estimation of these models has the advantage of yielding a full posterior distribution for the threshold between the body and the tail model, but the multimodality of the posterior distribution frequently leads to convergence or mixing problems in the Markov chain Monte Carlo simulation. In addition, the parameter dependence and a commonly used simplification in the estimation of the tail fraction may lead to inefficient sampling and imprecise estimates. A comprehensive simulation study and an empirical application to historical insurance losses show that the parallel tempering algorithm can substantially improve the convergence behavior of the Markov chains and reduce the lag-50 autocorrelations by forty percent or more. A reparameterization of the generalized Pareto distribution to overcome the threshold dependence of the scale parameter does not seem necessary in a real-world setting with limited sample sizes. Similar estimation results under the simplified sampling algorithm, which uses a maximum likelihood approximation of the tail fraction, and under a newly developed fully Bayesian approach further support the use of the simplified method.
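
As a rough illustration of the ideas summarized above, the following Python sketch defines an extreme value mixture density with a body model below a threshold and a generalized Pareto tail above it, and runs a toy parallel tempering update for the threshold. The normal body, the parameter names (mu, sigma_b, u, xi, sigma_t, phi), the temperature ladder, and the plug-in tail-fraction estimate are illustrative assumptions, not the authors' specification.

```python
import numpy as np
from scipy.stats import norm, genpareto


def mixture_logpdf(x, mu, sigma_b, u, xi, sigma_t, phi):
    """Log-density: truncated-normal body below u, GPD above u, tail fraction phi."""
    x = np.asarray(x, dtype=float)
    logp = np.empty_like(x)
    below = x <= u
    # Body: normal density truncated at u, carrying probability mass (1 - phi).
    logp[below] = (np.log1p(-phi)
                   + norm.logpdf(x[below], mu, sigma_b)
                   - norm.logcdf(u, mu, sigma_b))
    # Tail: generalized Pareto for exceedances over u, carrying mass phi.
    logp[~below] = np.log(phi) + genpareto.logpdf(x[~below] - u, xi, scale=sigma_t)
    return logp


def log_post(u, x, params):
    """Unnormalised log-posterior in the threshold u (other parameters held fixed)."""
    mu, sigma_b, xi, sigma_t = params
    phi = np.mean(x > u)          # maximum-likelihood plug-in for the tail fraction
    if phi <= 0.0 or phi >= 1.0:  # require observations in both body and tail
        return -np.inf
    return mixture_logpdf(x, mu, sigma_b, u, xi, sigma_t, phi).sum()


def parallel_tempering(x, params, temps=(1.0, 0.5, 0.25), n_iter=5000, step=0.5, seed=0):
    """Random-walk Metropolis on u at several temperatures with adjacent-chain swaps."""
    rng = np.random.default_rng(seed)
    u = np.full(len(temps), np.median(x))   # one chain per temperature
    lp = np.array([log_post(ui, x, params) for ui in u])
    draws = []
    for _ in range(n_iter):
        for k, t in enumerate(temps):       # within-chain tempered Metropolis update
            prop = u[k] + step * rng.normal()
            lp_prop = log_post(prop, x, params)
            if np.log(rng.uniform()) < t * (lp_prop - lp[k]):
                u[k], lp[k] = prop, lp_prop
        k = rng.integers(len(temps) - 1)    # propose to swap chains k and k + 1
        if np.log(rng.uniform()) < (temps[k] - temps[k + 1]) * (lp[k + 1] - lp[k]):
            u[[k, k + 1]] = u[[k + 1, k]]
            lp[[k, k + 1]] = lp[[k + 1, k]]
        draws.append(u[0])                  # retain only the cold (t = 1) chain
    return np.array(draws)


# Toy usage: simulated losses with a heavy right tail.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(0.0, 1.0, 900),
                           2.0 + genpareto.rvs(0.3, scale=1.0, size=100, random_state=rng)])
    thresholds = parallel_tempering(data, params=(0.0, 1.0, 0.3, 1.0))
    print("posterior mean threshold:", thresholds[1000:].mean())
```

The swap move accepts with log-probability (t_k - t_{k+1}) * (lp_{k+1} - lp_k), which is the standard parallel tempering exchange ratio; the hotter (smaller t) chains flatten the multimodal posterior and feed distant modes back to the cold chain.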