EcoSta 2023: Submission A1002
Title: Attention is not not not explanation: With a focus on insurance ratemaking
Authors: Kyungbae Park - Kangwon National University (Korea, South) [presenting]
Jae Youn Ahn - Ewha Womans University (Korea, South)
Rosy Oh - Korea Military Academy (Korea, South)
Yang Lu - Concordia University (Canada)
Dan Zhu - Monash University (Australia)
Abstract: Attention mechanisms have become a standard tool in NLP systems. In classical attention mechanisms, the attention weights serve as weights on the input units; hence it is often claimed that attention mechanisms provide interpretability. However, a series of recent studies arguably shows, via simulation, that the explainability of the attention mechanism is questionable. The interpretability of the attention mechanism is investigated in the setting of insurance ratemaking. Specifically, a mathematical argument is first provided showing that the classical attention mechanism fails to provide explainability. Then, an alternative attention mechanism is proposed for which the explainability of the attention layer is guaranteed. A simulation study demonstrates the performance of the proposed method.
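To fix ideas, the following is a minimal sketch, in Python/NumPy, of the kind of attention pooling the abstract refers to: inputs are scored, the scores are normalized by a softmax into attention weights, and the output is the weighted sum of the input units. All function and variable names here are illustrative assumptions, not taken from the paper or its proposed method.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_pooling(X, w_query):
    """Classical attention pooling: score each input unit, convert the
    scores to attention weights via softmax, and return the weighted
    sum of the inputs together with the weights themselves.

    X       : (n_units, d) array, one row per input unit
    w_query : (d,) scoring vector (learned in a real model)
    """
    scores = X @ w_query        # one relevance score per input unit
    weights = softmax(scores)   # nonnegative, sum to 1
    pooled = weights @ X        # weighted average of the input units
    return pooled, weights

# Illustrative usage on random data (not from the paper):
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))     # 5 input units with 3 features each
w = rng.normal(size=3)
pooled, weights = attention_pooling(X, w)
print(weights)                  # often read as per-unit "importance" scores
```

The interpretability claim rests on reading `weights` as importance scores for the input units; the abstract's point is that this reading is not guaranteed to be faithful.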