6AOnQ0KSUT@OpenReview

Total: 1

#1 Pruning for GNNs: Lower Complexity with Comparable Expressiveness

Authors: Dun Ma, Jianguo Chen, Wenguo Yang, Suixiang Gao, Shengminjie Chen

In recent years, the pursuit of higher expressive power in graph neural networks (GNNs) has often led to more complex aggregation mechanisms and deeper architectures. To address this, we identify redundant structures in GNNs and, by pruning them, propose Pruned MP-GNNs, Pruned K-Path GNNs, and Pruned K-Hop GNNs based on their original architectures. We show that: 1) although some structures are pruned, the expressive power of Pruned MP-GNNs and Pruned K-Path GNNs is not compromised; 2) K-Hop GNNs and their pruned architecture exhibit equivalent expressiveness on regular and strongly regular graphs; 3) the complexity of Pruned K-Path GNNs and Pruned K-Hop GNNs is lower than that of MP-GNNs, yet their expressive power is higher. Experimental results validate our refinements, demonstrating competitive performance across benchmark datasets with improved efficiency.
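The abstract does not spell out which structures are pruned. As a purely illustrative sketch, the PyTorch snippet below contrasts a generic K-hop message-passing layer with a hypothetical pruned variant that drops the per-hop transforms while keeping the same K-hop receptive field. The layer names, the choice of what to prune, and the dense-adjacency aggregation are all assumptions made for illustration, not the authors' construction.

```python
import torch
import torch.nn as nn


class KHopMPLayer(nn.Module):
    """Generic K-hop message passing: each hop k gets its own
    aggregation pass and its own transform before combination."""

    def __init__(self, dim: int, k: int):
        super().__init__()
        # One transform per hop: K weight matrices applied per layer.
        self.hop_mlps = nn.ModuleList([nn.Linear(dim, dim) for _ in range(k)])
        self.combine = nn.Linear(k * dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        hops, h = [], x
        for mlp in self.hop_mlps:
            h = adj @ h               # aggregate one more hop
            hops.append(mlp(h))       # per-hop transform (candidate redundancy)
        return torch.relu(self.combine(torch.cat(hops, dim=-1)))


class PrunedKHopMPLayer(nn.Module):
    """Hypothetical pruned variant (an assumption, not the paper's method):
    drop the per-hop transforms and keep only the raw k-hop aggregates,
    combined once. Same receptive field, fewer parameters and fewer dense
    transforms per layer."""

    def __init__(self, dim: int, k: int):
        super().__init__()
        self.k = k
        self.combine = nn.Linear(k * dim, dim)  # single combine step

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        hops, h = [], x
        for _ in range(self.k):
            h = adj @ h               # aggregation kept; transforms pruned
            hops.append(h)
        return torch.relu(self.combine(torch.cat(hops, dim=-1)))


if __name__ == "__main__":
    n, dim, k = 6, 8, 3
    adj = (torch.rand(n, n) < 0.4).float()
    adj = ((adj + adj.T) > 0).float()           # symmetric toy adjacency
    x = torch.randn(n, dim)
    full, pruned = KHopMPLayer(dim, k), PrunedKHopMPLayer(dim, k)
    print(full(x, adj).shape, pruned(x, adj).shape)   # both (6, 8)
    n_params = lambda m: sum(p.numel() for p in m.parameters())
    print(n_params(full), ">", n_params(pruned))      # pruned has fewer params
```

Both layers see the same K-hop neighborhood, so the sketch only illustrates the general idea of removing per-hop components to cut parameters and compute; whether expressiveness is preserved for a given pruning is exactly the question the paper's theoretical results address.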

Subject: ICML.2025 - Poster