CMStatistics 2023 (CFE)
A1035
Title: Uncertainty quantification in forecast comparisons
Authors: Marc-Oliver Pohle - Heidelberg Institute for Theoretical Studies (Germany) [presenting]
Tanja Zahn - Goethe University Frankfurt (Germany)
Abstract: Comparing competing forecasting methods via expected scores is the cornerstone of forecast evaluation. Skill scores, or relative expected scores, enhance interpretability by indicating the relative improvement of a forecasting method over a competitor. At present, statistical inference in forecast comparisons is usually restricted to forecast accuracy tests for single forecast horizons, single variables and single locations. Simultaneous confidence bands for skill scores (as well as for relative expected scores and score differences) are introduced to quantify sampling uncertainty in forecast comparisons. The confidence bands are a simple tool to characterize and represent sampling uncertainty graphically. Moreover, they can be used for a single variable over multiple forecast horizons or multiple locations, or for multiple variables, and thus avoid multiple comparison problems. They are applicable to any type of forecast, from mean and quantile to distributional forecasts, and are implemented via a moving block bootstrap. The validity of the bands is ensured by an assumption akin to the classical Diebold-Mariano assumption for forecast accuracy tests. The methodology is illustrated in applications to economic and meteorological forecasts, which also underscore the perils of ignoring sampling uncertainty and the inherently multi-horizon, multi-location or multi-variable nature of forecast evaluation.
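To make the idea concrete, the following is a minimal sketch of how a simultaneous confidence band for mean score differences across several horizons (or locations, or variables) could be obtained with a moving block bootstrap. It is not the authors' implementation: the function names, the sup-t-style calibration of the band width, and the default tuning parameters (block length, number of bootstrap replications, confidence level) are illustrative assumptions; only the ingredients named in the abstract (score differences, moving block bootstrap, simultaneous coverage) come from the source.

```python
import numpy as np

def score_differences(scores_a, scores_b):
    """Per-period score differences d_t = S_A,t - S_B,t for negatively
    oriented scores (lower = better). Inputs are (T, H) arrays with one
    column per horizon/location/variable."""
    return np.asarray(scores_a, float) - np.asarray(scores_b, float)

def mbb_simultaneous_band(diffs, block_len=10, n_boot=2000, level=0.9, seed=None):
    """Simultaneous confidence band for the mean score differences.

    diffs : (T, H) array of score differences.
    Returns (estimate, lower, upper), each of length H, with joint
    (simultaneous) nominal coverage `level` across the H columns.
    """
    rng = np.random.default_rng(seed)
    d = np.asarray(diffs, float)
    T, H = d.shape
    est = d.mean(axis=0)

    # Moving block bootstrap: resample overlapping blocks of consecutive
    # rows to preserve serial dependence, then recompute the column means.
    n_blocks = int(np.ceil(T / block_len))
    boot_means = np.empty((n_boot, H))
    for b in range(n_boot):
        starts = rng.integers(0, T - block_len + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:T]
        boot_means[b] = d[idx].mean(axis=0)

    # Sup-t style calibration (an illustrative choice): widen the pointwise
    # intervals by a common critical value so that the band covers all H
    # bootstrap means jointly with the nominal probability.
    se = boot_means.std(axis=0, ddof=1)
    crit = np.quantile(np.abs((boot_means - est) / se).max(axis=1), level)
    return est, est - crit * se, est + crit * se

# Illustrative usage with simulated squared-error scores over 12 horizons:
rng = np.random.default_rng(0)
T, H = 200, 12
errors_a = rng.normal(0.0, 1.0, size=(T, H))
errors_b = rng.normal(0.1, 1.0, size=(T, H))
d = score_differences(errors_a**2, errors_b**2)
est, lo, hi = mbb_simultaneous_band(d, block_len=10, n_boot=2000, level=0.9, seed=1)
```

If the band for the score differences excludes zero at some horizons but not others, the graphical display makes that pattern visible at a glance while controlling the familywise error across all horizons, which is the multiple-comparison point stressed in the abstract.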