Title: Soft tensor regression
Authors: Georgia Papadogeorgou - Duke University (United States) [presenting]
Zhengwu Zhang - University of Rochester (United States)
David Dunson - Duke University (United States)
Abstract: Statistical methods relating tensor predictors to scalar outcomes in a regression model generally vectorize the tensor predictor and estimate the coefficients of its entries under some form of regularization, use summaries of the tensor covariate, or use a low-dimensional approximation of the coefficient tensor. However, low-rank approximations of the coefficient tensor can perform poorly if the true rank of the tensor is not small. We propose a tensor regression framework that assumes a soft version of the parallel factors (PARAFAC) approximation. In contrast to classic PARAFAC, where each entry of the coefficient tensor is the sum of products of row-specific contributions across the tensor modes, the soft tensor regression (Softer) framework allows the row-specific contributions to vary around an overall mean. A Bayesian approach to inference is followed, and it is shown that softening the PARAFAC increases model flexibility and leads to improved estimation of coefficient tensors and more accurate predictions, even for a low approximation rank. In the context of the motivating application, Softer is adapted to symmetric and semi-symmetric tensor predictors and used to analyze the relationship between brain network characteristics and human traits.
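The contrast between classic PARAFAC and its softened version can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the dimensions, rank, and the Gaussian perturbation scale `tau` are hypothetical choices made for illustration; the Bayesian inference machinery of Softer is omitted entirely. The sketch only shows how entry-specific deviations around overall mean factor rows relax the rank-R structure, with `tau -> 0` recovering classic PARAFAC.

```python
import numpy as np

rng = np.random.default_rng(0)
p1, p2, p3, R = 4, 5, 6, 2  # tensor dimensions and approximation rank (hypothetical)

# Overall mean factor rows, one matrix per tensor mode (as in classic PARAFAC).
g1 = rng.normal(size=(p1, R))
g2 = rng.normal(size=(p2, R))
g3 = rng.normal(size=(p3, R))

# Classic rank-R PARAFAC: each coefficient entry (i, j, k) is the sum over
# ranks r of the product g1[i, r] * g2[j, r] * g3[k, r].
B_parafac = np.einsum('ir,jr,kr->ijk', g1, g2, g3)

# Soft PARAFAC: let each row-specific contribution vary around its overall
# mean. Here mode 1's contribution to entry (i, j, k) is g1[i, r] plus a
# small entry-specific deviation; tau controls the softness (assumed scale).
tau = 0.1
d1 = g1[:, None, None, :] + tau * rng.normal(size=(p1, p2, p3, R))
d2 = g2[None, :, None, :] + tau * rng.normal(size=(p1, p2, p3, R))
d3 = g3[None, None, :, :] + tau * rng.normal(size=(p1, p2, p3, R))
B_soft = (d1 * d2 * d3).sum(axis=-1)  # softened coefficient tensor

print(B_parafac.shape, B_soft.shape)  # both (4, 5, 6)
```

Because the deviations enter each entry separately, `B_soft` is no longer constrained to have rank R, which is the extra flexibility the abstract refers to when the true rank of the coefficient tensor is not small.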