CFE-CMStatistics 2024
A0835
Title: A framework for analyzing the cost-benefit tradeoff of training neural networks
Authors: Simon Spavound - LeBow College of Business, Drexel University (United States) [presenting]
Nikolaos Kourentzes - University of Skövde (Sweden)
Abstract: Recent computational advances, together with interest in applications such as large language models, have driven the deployment of ever-larger deep neural networks. Irrespective of network size, training is a complex and stochastic process that affects the quality of the model outputs. Achieving reliable results may require multiple training runs, which substantially increases computational cost; moreover, large models can remain computationally intensive even after training. A socially disadvantageous side effect is the cost, both monetary and environmental, of training such models. Little practical advice is available on the tradeoff between training models multiple times for increased accuracy or reliability and the computational cost of doing so. The tradeoff is illustrated on a time series forecasting experimental setup to provide some guidance in this area and to open avenues for future exploration. The tradeoffs are demonstrated for local as well as global models, and between the two.
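The core tradeoff the abstract describes can be sketched with a toy Monte Carlo simulation. This is a hypothetical illustration, not the authors' experiment: it assumes each training run of the same model produces a forecast whose error has a run-to-run stochastic component (e.g., from random initialization or data shuffling), so averaging k independently trained runs shrinks variance roughly as 1/k while compute cost grows linearly in k. All constants below are invented for illustration.

```python
import math
import random

random.seed(0)

# Hypothetical setup: a single trained model forecasts TRUE_VALUE with
# Gaussian run-to-run noise of standard deviation RUN_SD.  Ensembling k
# independent runs averages out this noise at k times the training cost.
TRUE_VALUE = 10.0      # quantity being forecast
RUN_SD = 2.0           # run-to-run standard deviation of one trained model
COST_PER_RUN = 1.0     # normalized compute cost of a single training run
N_TRIALS = 20_000      # Monte Carlo repetitions per ensemble size


def ensemble_rmse(k: int) -> float:
    """RMSE of the mean forecast of k independently trained runs."""
    sq_err = 0.0
    for _ in range(N_TRIALS):
        mean_forecast = sum(
            random.gauss(TRUE_VALUE, RUN_SD) for _ in range(k)
        ) / k
        sq_err += (mean_forecast - TRUE_VALUE) ** 2
    return math.sqrt(sq_err / N_TRIALS)


for k in (1, 2, 5, 10):
    print(f"runs={k:2d}  cost={k * COST_PER_RUN:4.1f}  RMSE={ensemble_rmse(k):.3f}")
```

The printout shows RMSE falling roughly as RUN_SD / sqrt(k) while cost rises linearly, i.e., each additional run buys a diminishing accuracy gain at a constant marginal cost, which is the cost-benefit tension the paper studies empirically.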