A1231
Title: Assessing the conditional calibration of interval forecasts using decompositions of the interval score
Authors: Sam Allen - Karlsruhe Institute of Technology (Germany) [presenting]
Julia Burnello - ETH Zurich (Switzerland)
Johanna Ziegel - ETH Zurich (Switzerland)
Abstract: Forecasts for uncertain future events should be probabilistic. Probabilistic forecasts are commonly issued as prediction intervals, which provide a measure of uncertainty in the unknown outcome whilst being easier to understand and communicate than full predictive distributions. The calibration of a prediction interval can be assessed by checking whether the probability that the outcome falls within the interval equals the nominal coverage level. However, such coverage checks are typically unconditional and therefore relatively weak. Although this is well known, there is a lack of methods to assess the conditional calibration of interval forecasts. The purpose is to demonstrate how conditional calibration can be assessed via decompositions of the well-known interval (or Winkler) score. Notions of calibration for interval forecasts are studied, and a decomposition of the interval score based on isotonic distributional regression is then introduced. This decomposition exhibits many desirable properties, both in theory and in practice, which allow users to assess the conditional calibration of interval forecasts accurately. This is illustrated on simulated data and in three applications to benchmark regression datasets.
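
As a minimal sketch of the quantities mentioned in the abstract (the interval, or Winkler, score and an unconditional coverage check), the following Python/numpy example may be helpful; it is not the authors' IDR-based decomposition, and the function names and the toy Gaussian data are illustrative assumptions only.

import numpy as np

def interval_score(lower, upper, y, alpha):
    # Interval (Winkler) score for central (1 - alpha) x 100% prediction intervals:
    # width of the interval plus a penalty of 2/alpha times the distance by which
    # the outcome y falls outside the interval. Smaller scores are better.
    lower, upper, y = map(np.asarray, (lower, upper, y))
    width = upper - lower
    below = (2.0 / alpha) * np.maximum(lower - y, 0.0)  # penalty when y < lower
    above = (2.0 / alpha) * np.maximum(y - upper, 0.0)  # penalty when y > upper
    return width + below + above

def empirical_coverage(lower, upper, y):
    # Fraction of outcomes falling inside their prediction intervals.
    y = np.asarray(y)
    return np.mean((np.asarray(lower) <= y) & (y <= np.asarray(upper)))

# Toy example: central 90% intervals from Gaussian predictive distributions.
rng = np.random.default_rng(0)
mu = rng.normal(size=1000)
y = mu + rng.normal(size=1000)
z = 1.6449  # standard normal 0.95-quantile, giving a central 90% interval
lower, upper = mu - z, mu + z

print("mean interval score:", interval_score(lower, upper, y, alpha=0.1).mean())
print("empirical coverage :", empirical_coverage(lower, upper, y))

An empirical coverage close to the nominal 90% level only indicates unconditional (marginal) calibration; the abstract's point is that stronger, conditional checks can be obtained by decomposing the mean interval score, using isotonic distributional regression, into components reflecting miscalibration and discrimination.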