EcoSta 2024
Submission A1018
Title: Limits and potentials of graph neural networks: From spectral simplicity to polynomial bounds
Authors: Arash Amini - UCLA (United States) [presenting]
Abstract: The interplay between complexity and efficiency in graph neural networks (GNNs) is explored, drawing on theoretical and empirical insights. Empirical evidence shows how spectral GNNs for semi-supervised node classification benefit from architectural simplicity, challenging the need for complex GNN architectures: simpler methods can rival or exceed their performance. This motivates a theoretical study of polynomial GNNs (Poly-GNNs), examining the effect of 'depth' (higher-order polynomial features) within a contextual stochastic block model. Contrary to expectations, increasing depth is found to mainly modify constants without altering the fundamental rates of separability. This finding highlights the pronounced effect of 'network noise' in deep GNNs and questions the prevailing belief that greater complexity necessarily improves discriminative power in GNNs.
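The following is a minimal sketch, in Python, of the kind of polynomial GNN the abstract refers to: node covariates drawn from a contextual stochastic block model are propagated through powers of a normalized adjacency operator, and 'depth' corresponds to the highest polynomial order used. The specific parameters, normalization, and readout below (two equal blocks, symmetric normalization, a logistic-regression classifier) are illustrative assumptions, not the exact construction analyzed in the paper.

```python
# Minimal Poly-GNN sketch on a contextual stochastic block model (illustrative
# assumptions throughout; not the paper's exact setup).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# --- Contextual SBM: two-block graph plus Gaussian node covariates ---
n, d = 400, 5                        # nodes, covariate dimension
labels = np.repeat([0, 1], n // 2)   # two equal communities (assumed)
p_in, p_out = 0.10, 0.02             # within/between-block edge probabilities (assumed)
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T       # symmetric adjacency, no self-loops

mu = 0.7                             # covariate signal strength (assumed)
X = mu * np.where(labels[:, None] == 0, -1.0, 1.0) * np.ones((n, d)) \
    + rng.normal(size=(n, d))        # class-dependent means plus noise

# --- Normalized operator S = D^{-1/2} A D^{-1/2} (one common choice) ---
deg = np.maximum(A.sum(1), 1.0)
S = A / np.sqrt(deg)[:, None] / np.sqrt(deg)[None, :]

def poly_features(S, X, K):
    """Concatenate propagated covariates [X, SX, ..., S^K X]; K plays the role of 'depth'."""
    feats, H = [X], X
    for _ in range(K):
        H = S @ H
        feats.append(H)
    return np.concatenate(feats, axis=1)

# Compare shallow vs. deeper polynomial features on held-out nodes.
train = rng.random(n) < 0.1          # small labeled set (semi-supervised flavor)
for K in (1, 2, 4):
    Z = poly_features(S, X, K)
    clf = LogisticRegression(max_iter=2000).fit(Z[train], labels[train])
    acc = clf.score(Z[~train], labels[~train])
    print(f"depth K={K}: test accuracy {acc:.3f}")
```

In this simplified setting, varying K gives a rough feel for how much of the gain comes from the first propagation step versus deeper polynomial terms, in the spirit of the constants-versus-rates distinction the abstract describes.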