COMPSTAT 2024 - Submission A0153
Title: Analysing the impact of removing infrequent terms on topic quality in LDA models
Authors: Viktoriia Naboka-Krell - Justus Liebig University of Giessen (Germany) [presenting]
Peter Winker - University of Giessen (Germany)
Victor Bystrov - University of Lodz (Poland)
Anna Staszewska-Bystrova - University of Lodz (Poland)
Abstract: An initial procedure in text-as-data applications is text preprocessing. One of the typical steps, which can substantially facilitate computations, consists of removing infrequent words believed to provide limited information about the corpus. Despite the popularity of vocabulary pruning, few guidelines on how to implement it are available in the literature. The aim is to fill this gap by examining the effects of removing infrequent words on the quality of topics estimated using Latent Dirichlet Allocation. The analysis is based on Monte Carlo experiments taking into account different criteria for the removal of infrequent terms and various evaluation metrics. The results indicate that pruning is beneficial and that the share of vocabulary which can be eliminated may be quite considerable.
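To make the pipeline concrete, the sketch below shows where vocabulary pruning sits relative to LDA estimation; it is a minimal illustration in Python, assuming scikit-learn, a toy corpus, and a simple minimum-document-frequency criterion. The thresholds, corpus, and number of topics are illustrative assumptions, not the settings studied in the paper.

```python
# Minimal illustration: prune infrequent terms, then estimate LDA.
# Corpus, min_df threshold and n_components are illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "topic models summarise large text collections",
    "infrequent terms add noise to the vocabulary",
    "pruning the vocabulary can improve topic quality",
    "latent dirichlet allocation estimates latent topics",
]

# min_df=2 drops terms occurring in fewer than two documents,
# i.e. one possible criterion for identifying infrequent terms.
vectorizer = CountVectorizer(min_df=2)
dtm = vectorizer.fit_transform(docs)

# Fit LDA on the pruned document-term matrix.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

print("Retained vocabulary:", vectorizer.get_feature_names_out())
```

The Monte Carlo experiments in the paper compare such pruning criteria and thresholds across evaluation metrics; the snippet only indicates where the pruning step enters the estimation pipeline.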