A0812
Title: Information criteria for gradient boosted trees: Adaptive tree size and early stopping
Authors: Berent Aanund Stroemnes Lunde - University of Stavanger (Norway) [presenting]
Tore Selland Kleppe - University of Stavanger (Norway)
Hans Skaug - University of Bergen (Norway)
Abstract: In gradient tree boosting, the functional form of the ensemble changes repeatedly during training. To select a sensible functional complexity for the boosting ensemble, the leading implementations offer a large number of regularization hyperparameters for manual tuning. This tuning typically requires computationally costly cross-validation combined with some expert knowledge. To address this, we propose an information criterion for gradient boosted trees, applicable both to learning the structure of individual trees and as a stopping criterion for the boosting algorithm. The resulting algorithm adapts to the training data at hand; it is largely automatic and carries little risk of overfitting. Moreover, computing the criterion adds little overhead, and, as the algorithm only has to run once, the computational cost is drastically reduced compared with implementations that rely on manual tuning.
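
To illustrate the general idea of information-criterion-based early stopping (not the specific criterion derived in this work), the following minimal Python sketch stops boosting once the training-loss reduction of a new tree no longer exceeds a complexity penalty. The squared-error loss, the AIC-style penalty of 2 * (number of leaves) / n, and all function names are assumptions made purely for this example.

  # Illustrative sketch only: a generic penalized stopping rule for gradient
  # boosting, not the criterion proposed in the abstract above.
  import numpy as np
  from sklearn.tree import DecisionTreeRegressor

  def boost_with_ic_stopping(X, y, learning_rate=0.1, max_trees=500, max_depth=3):
      """Boost regression trees; stop when the penalized loss reduction is non-positive."""
      n = len(y)
      prediction = np.full(n, y.mean())          # initial constant model
      trees = []
      for _ in range(max_trees):
          residuals = y - prediction             # negative gradient of squared-error loss
          tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
          step = learning_rate * tree.predict(X)

          old_loss = np.mean((y - prediction) ** 2)
          new_loss = np.mean((y - prediction - step) ** 2)
          penalty = 2.0 * tree.get_n_leaves() / n   # crude AIC-like complexity charge (assumed)
          if old_loss - new_loss <= penalty:        # no penalized improvement: stop boosting
              break
          prediction += step
          trees.append(tree)
      return trees, y.mean(), learning_rate

  # Usage on synthetic data
  rng = np.random.default_rng(0)
  X = rng.uniform(-3, 3, size=(500, 2))
  y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
  trees, intercept, lr = boost_with_ic_stopping(X, y)
  print(f"stopped after {len(trees)} trees")

In the approach described in the abstract, a comparable penalized comparison would also govern the growth of each individual tree, so that both tree size and the number of boosting iterations adapt to the training data without cross-validation.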