COMPSTAT 2023, Submission A0408
Title: Adaptive latent feature sharing for piecewise linear dimensionality reduction
Authors: Yordan Raykov - University of Nottingham (United Kingdom) [presenting]
Abstract: Linear Gaussian exploratory tools such as principal component analysis (PCA) and factor analysis (FA) are widely used for data analysis and visualization. However, their limitations in high-dimensional problems have motivated more robust and flexible models. Discrete-continuous latent feature models offer a solution by inferring which features are shared among data points. We propose a new approach based on two-parameter discrete distribution models that decouple feature sparsity from dictionary size, capturing both common and rare features effectively. This framework yields adaptive variants of factor analysis (aFA) and probabilistic principal component analysis (aPPCA), allowing for flexible structure discovery and dimensionality reduction. We provide efficient inference methods based on Gibbs sampling and expectation-maximization, which converge quickly to accurate estimates. The effectiveness of aPPCA and aFA is demonstrated on feature learning, data visualization, and data whitening tasks. These models extract meaningful features from MNIST and COIL-20 images and from autoencoder features. Moreover, replacing PCA with aPPCA in functional magnetic resonance imaging (fMRI) analysis improves blind source separation of neural activity, yielding more robust and localized results.
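
As a toy illustration of the discrete-continuous construction described in the abstract, the sketch below draws data from a finite Beta-Bernoulli approximation to a two-parameter feature-allocation prior: a binary matrix selects which dictionary features each point uses, while Gaussian loadings and scores supply the continuous part. The function names, the finite-K approximation, and all parameter values are illustrative assumptions made here for exposition, not the authors' aPPCA/aFA implementation or inference scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_feature_allocations(N, K, alpha=2.0, beta=1.0):
    """Finite Beta-Bernoulli approximation of a two-parameter feature prior.

    alpha loosely controls how many dictionary features are active overall,
    beta shifts sharing between common and rare features.
    (Hypothetical parameterisation; the paper's exact prior may differ.)
    """
    # Per-feature activation probabilities: pi_k ~ Beta(alpha*beta/K, beta)
    pi = rng.beta(alpha * beta / K, beta, size=K)
    # Binary allocations: z_nk ~ Bernoulli(pi_k) says which features point n uses
    Z = (rng.random((N, K)) < pi).astype(float)
    return Z, pi

def generate_toy_data(N=500, D=20, K=8, noise_std=0.1):
    """Generate data from a discrete-continuous latent feature construction."""
    W = rng.normal(size=(D, K))      # continuous dictionary loadings
    S = rng.normal(size=(N, K))      # continuous per-point scores
    Z, _ = sample_feature_allocations(N, K)
    # Each point is reconstructed only from its active features, so the
    # effective linear map varies across points (piecewise linear structure).
    X = (Z * S) @ W.T + noise_std * rng.normal(size=(N, D))
    return X, W, Z

X, W, Z = generate_toy_data()
print("data shape:", X.shape)
print("mean features used per point:", Z.sum(axis=1).mean())
print("globally active features:", int((Z.sum(axis=0) > 0).sum()))
```

In this finite toy setup, increasing alpha tends to switch on more dictionary features overall, while beta redistributes mass between widely shared and rarely used features, which mirrors the decoupling of feature sparsity and dictionary size that the abstract refers to.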