A1468
Topic: Contributions on evaluation of forecasting
Title: A fast model confidence set implementation for large and growing collections of models
Authors: Sylvain Barde - University of Kent (United Kingdom) [presenting]
Abstract: A new algorithm is proposed for finding the confidence set of a collection of forecasts or prediction models. Existing numerical implementations for finding the confidence set use an elimination approach: one starts with the full collection of models and successively eliminates the worst-performing one until the null of equal predictive ability is no longer rejected at a given confidence level. The intuition behind the proposed implementation lies in reversing the process: one starts with a collection of two models and, as models are successively added to the collection, both the model rankings and $p$-values are updated. The first benefit of this updating approach is a reduction of one polynomial order in both the time complexity and the memory cost of finding the confidence set of a collection of $M$ models, falling respectively from $\mathcal{O}(M^3)$ to $\mathcal{O}(M^2)$ and from $\mathcal{O}(M^2)$ to $\mathcal{O}(M)$. This theoretical prediction is confirmed by a Monte Carlo benchmarking analysis of the algorithms. The second key benefit of the updating approach is that it naturally allows further models to be added at a later point in time, thus enabling collaborative efforts using the model confidence set procedure.
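To make the contrast with the elimination approach concrete, the Python sketch below illustrates the updating idea only; it is not the implementation described in the abstract. Each new model extends the matrix of mean loss differentials by one row and column rather than rebuilding it, after which the ranking and a bootstrap $p$-value for the null of equal predictive ability are refreshed. The class and method names are hypothetical, the range statistic is one of several used in the model confidence set literature, and a simple i.i.d. bootstrap stands in for the block bootstrap of the original procedure; the $p$-value step here also recomputes the pairwise differentials naively, which the actual algorithm would update incrementally.

```python
import numpy as np

class IncrementalMCS:
    """Illustrative sketch: grow a model collection one model at a time and
    refresh rankings and an equal-predictive-ability p-value after each addition."""

    def __init__(self, n_boot=500, seed=0):
        self.losses = []                  # list of (T,) loss series, one per model
        self.dbar = np.zeros((0, 0))      # dbar[i, j] = mean over t of L_i(t) - L_j(t)
        self.n_boot = n_boot
        self.rng = np.random.default_rng(seed)

    def add_model(self, loss):
        """Append one model's loss series; extend dbar by a single row/column
        instead of recomputing the full matrix of mean loss differentials."""
        loss = np.asarray(loss, dtype=float)
        new_col = np.array([np.mean(old - loss) for old in self.losses])
        m = len(self.losses)
        dbar = np.zeros((m + 1, m + 1))
        dbar[:m, :m] = self.dbar
        dbar[:m, m] = new_col
        dbar[m, :m] = -new_col
        self.dbar = dbar
        self.losses.append(loss)

    def ranking_and_pvalue(self):
        """Rank models by average relative loss and return a bootstrap p-value
        for the null of equal predictive ability over the current collection."""
        L = np.column_stack(self.losses)           # (T, M) losses
        T, M = L.shape
        d_idot = self.dbar.mean(axis=1)            # average loss of model i relative to the rest
        ranking = np.argsort(d_idot)               # best (lowest relative loss) first

        # Range statistic over pairwise t-ratios and its bootstrap null distribution.
        iu = np.triu_indices(M, k=1)
        d = L[:, :, None] - L[:, None, :]          # (T, M, M) loss differentials (naive recompute)
        se = d.std(axis=0, ddof=1)[iu] / np.sqrt(T) + 1e-12
        t_range = np.max(np.abs(self.dbar[iu]) / se)

        boot_stats = np.empty(self.n_boot)
        for b in range(self.n_boot):
            idx = self.rng.integers(0, T, size=T)             # i.i.d. resample of time periods
            db = d[idx].mean(axis=0)[iu] - self.dbar[iu]      # centred bootstrap differentials
            boot_stats[b] = np.max(np.abs(db) / se)
        p_value = float(np.mean(boot_stats >= t_range))
        return ranking, p_value


# Usage: start from two models, then add further ones as they arrive,
# updating the ranking and p-value after each addition.
# mcs = IncrementalMCS()
# mcs.add_model(loss_a); mcs.add_model(loss_b)
# for loss in later_models:
#     mcs.add_model(loss)
#     ranking, p = mcs.ranking_and_pvalue()
```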