A1439
Title: Statistical few-shot learning via parameter pooling
Authors: Andrew Simpson - South Dakota State University (United States)
Semhar Michael - South Dakota State University (United States) [presenting]
Abstract: The emergence of high-dimensional data in which observations are partitioned into many classes with only a limited number of samples per class poses a significant challenge to classical probabilistic machine-learning techniques. Few-shot and one-shot learning problems fall into this category. Given the nature of the few-shot learning framework, strong assumptions about the data-generating process must be made. To obtain non-singular and stable estimates of the covariance matrix of each class, it is often assumed that all classes share the same covariance matrix, as in linear discriminant analysis (LDA). Because the number of classes tends towards infinity in this setting, this assumption is extreme. The strong assumption of LDA is relaxed while stable estimates of the covariance matrices are still obtained. Specifically, a finite number of distributions is assumed to exist from which a class can arise; in the Gaussian case, this amounts to assuming a finite number of covariance matrices that a class can take, fewer than the number of classes. Simulation studies and real data analysis demonstrate the utility of the proposed methodology.
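
The abstract does not spell out an estimation algorithm. Below is a minimal sketch, assuming a hard-assignment, EM-style procedure, of how classes might be pooled into a small number of shared covariance matrices; the function name fit_pooled_covariances, the argument n_pools, and the ridge term reg are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the authors' method): group classes into n_pools shared covariance
# matrices, relaxing LDA's single pooled covariance while keeping estimates stable
# when each class has few samples.
import numpy as np

def fit_pooled_covariances(X, y, n_pools, n_iter=20, reg=1e-3, seed=0):
    """Estimate class means and assign each class to one of `n_pools`
    shared covariance matrices via a hard-assignment EM-style loop."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    d = X.shape[1]
    means = {c: X[y == c].mean(axis=0) for c in classes}
    counts = {c: int((y == c).sum()) for c in classes}
    # Per-class scatter matrices (sum of outer products of centered samples)
    scatters = {c: np.cov(X[y == c].T, bias=True) * counts[c] for c in classes}
    # Random initial assignment of classes to covariance pools
    assign = {c: int(rng.integers(n_pools)) for c in classes}

    for _ in range(n_iter):
        # M-step: pool the scatter matrices of all classes assigned to each group
        pooled = []
        for k in range(n_pools):
            S = sum(scatters[c] for c in classes if assign[c] == k)
            n = sum(counts[c] for c in classes if assign[c] == k)
            pooled.append(S / n + reg * np.eye(d) if n > 0 else np.eye(d))
        # E-step (hard): reassign each class to the pool maximizing its likelihood
        new_assign = {}
        for c in classes:
            Xc = X[y == c] - means[c]
            lls = []
            for Sigma in pooled:
                _, logdet = np.linalg.slogdet(Sigma)
                mahal = np.einsum("ij,jk,ik->i", Xc, np.linalg.inv(Sigma), Xc).sum()
                lls.append(-0.5 * (counts[c] * logdet + mahal))
            new_assign[c] = int(np.argmax(lls))
        if new_assign == assign:
            break
        assign = new_assign
    return means, pooled, assign
```

Under these assumptions, a new observation could then be classified with the Gaussian discriminant rule, using each class mean together with the pooled covariance of the group that class was assigned to.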