EcoSta 2023
A0667
Title: Integration of fMRI and genomics data with interpretable multimodal deep learning
Authors: Yu-Ping Wang - Tulane University (United States) [presenting]
Abstract: Integrating multi-modal brain imaging and genomics has been widely used in brain studies to improve the diagnosis of mental diseases. This calls for novel data integration models that capture the complex associations between multi-modal brain imaging and genomics and, furthermore, for interpretable approaches that uncover biological mechanisms with these models. An interpretable multi-modal integration model is developed to simultaneously perform automated disease diagnosis and result interpretation. Named Grad-CAM-guided convolutional collaborative learning (gCAM-CCL), the model combines intermediate feature maps with gradient-based weights. The gCAM-CCL model can generate interpretable activation maps that quantify the pixel-level contributions of the input features. Moreover, the estimated activation maps are class-specific and can therefore facilitate the identification of biomarkers underlying different groups. The gCAM-CCL model is applied and validated on a large brain imaging-genomics cohort, and its applications are demonstrated both to the classification of cognitive function groups and to the discovery of underlying biological mechanisms.
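The gradient-weighted activation-map idea the abstract describes can be sketched as follows. This is a minimal, generic Grad-CAM-style computation in NumPy, not the authors' gCAM-CCL implementation; the function name and toy shapes are illustrative assumptions:

```python
import numpy as np

def grad_cam_map(feature_maps, gradients):
    """Grad-CAM-style activation map (illustrative sketch).

    Each intermediate feature map is weighted by the spatially
    averaged gradient of the class score with respect to that map;
    the weighted sum is passed through a ReLU so only features with
    positive evidence for the class remain, then normalized to [0, 1].
    Both inputs have shape (channels, H, W).
    """
    weights = gradients.mean(axis=(1, 2))               # one weight per channel
    cam = np.tensordot(weights, feature_maps, axes=([0], [0]))  # weighted sum -> (H, W)
    cam = np.maximum(cam, 0)                            # ReLU: keep positive contributions
    if cam.max() > 0:
        cam = cam / cam.max()                           # scale to [0, 1]
    return cam

# Toy example: 3 channels of 4x4 feature maps and matching gradients.
rng = np.random.default_rng(0)
fmaps = rng.standard_normal((3, 4, 4))
grads = rng.standard_normal((3, 4, 4))
cam = grad_cam_map(fmaps, grads)
print(cam.shape)  # (4, 4)
```

Because the gradients are taken with respect to a particular class score, the resulting map is class-specific, which is what lets the approach highlight different input regions for different diagnostic groups.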