44gnGhurnZ@OpenReview


#1 A Dynamical Systems-Inspired Pruning Strategy for Addressing Oversmoothing in Graph Attention Networks

Authors: Biswadeep Chakraborty, Harshit Kumar, Saibal Mukhopadhyay

Graph Neural Networks (GNNs) face a critical limitation known as oversmoothing, where increasing network depth leads to homogenized node representations, severely compromising their expressiveness. We present a novel dynamical systems perspective on this challenge, revealing oversmoothing as an emergent property of GNNs' convergence to low-dimensional attractor states. Based on this insight, we introduce **DYNAMO-GAT**, which combines noise-driven covariance analysis with Anti-Hebbian learning to dynamically prune attention weights, effectively preserving distinct attractor states. We provide theoretical guarantees for DYNAMO-GAT's effectiveness and demonstrate its superior performance on benchmark datasets, consistently outperforming existing methods while requiring fewer computational resources. This work establishes a fundamental connection between dynamical systems theory and GNN behavior, providing both theoretical insights and practical solutions for deep graph learning.
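The abstract's core mechanism (noise-driven covariance analysis feeding an anti-Hebbian weakening of attention weights) can be sketched in a few lines. This is a minimal NumPy illustration of the general idea, not the authors' DYNAMO-GAT implementation: the one-step propagation, the learning rate `eta`, and the 30% prune ratio are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, samples = 8, 4, 200

X = rng.normal(size=(n, d))            # node features
A = rng.uniform(size=(n, n))           # dense "attention" weights (illustrative)
A /= A.sum(axis=1, keepdims=True)      # row-normalize, as attention would be

# Noise-driven covariance analysis: perturb inputs, propagate one step,
# and record a scalar state per node across many noise samples.
states = np.empty((samples, n))
for s in range(samples):
    noisy = X + 0.1 * rng.normal(size=X.shape)
    states[s] = (A @ noisy).mean(axis=1)

C = np.corrcoef(states, rowvar=False)  # n x n node-node correlation matrix

# Anti-Hebbian update: weaken edges between strongly correlated nodes,
# since high correlation signals collapse toward a shared attractor.
eta = 0.5
A_new = A * (1.0 - eta * np.clip(C, 0.0, 1.0))
np.fill_diagonal(A_new, np.diag(A))    # leave self-weights untouched

# Prune: zero out the weakest 30% of off-diagonal weights (ratio assumed).
off_diag = A_new[~np.eye(n, dtype=bool)]
threshold = np.quantile(off_diag, 0.3)
pruned = np.where(A_new < threshold, 0.0, A_new)
```

The intuition this sketch captures is that correlated node states indicate convergence toward a low-dimensional attractor, so the edges driving that correlation are the ones to suppress; the paper's actual criterion and update rule may differ.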

Subject: ICML.2025 - Poster