pD5oklKrDV@OpenReview


#1 Improved Expressivity of Hypergraph Neural Networks through High-Dimensional Generalized Weisfeiler-Leman Algorithms

Authors: Detian Zhang, Zhang Chengqiang, Yanghui Rao, Qing Li, Chunjiang Zhu

The isomorphism problem is a key challenge in both graph and hypergraph domains, crucial for applications like protein design, chemical pathways, and community detection. Hypergraph isomorphism, which models high-order relationships in real-world scenarios, remains underexplored compared to graph isomorphism. Current algorithms for hypergraphs, like the 1-dimensional generalized Weisfeiler-Leman test (1-GWL), lag behind advancements in graph isomorphism tests, limiting most hypergraph neural networks to 1-GWL's expressive power. To address this, we propose the high-dimensional GWL (k-GWL), generalizing k-WL from graphs to hypergraphs. We prove that k-GWL reduces to k-WL for simple graphs, and thus develop a unified isomorphism method for both graphs and hypergraphs. We also establish a clear and complete understanding of the GWL hierarchy of expressivity, showing with illustrative examples that (k+1)-GWL is strictly more expressive than k-GWL. Based on k-GWL, we develop a hypergraph neural network model named k-HNN with the improved expressive power of k-GWL, which achieves superior performance on real-world datasets, including a 6% accuracy improvement on the Steam-Player dataset over the runner-up. Our code is available at https://github.com/talence-zcq/KGWL.
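To make the abstract's starting point concrete, the following is a minimal sketch of 1-dimensional color refinement on a hypergraph (the flavor of test the abstract calls 1-GWL): node and hyperedge colors are refined alternately, a hyperedge taking the multiset of its members' colors and a node taking the multiset of its incident hyperedges' colors. This is an illustrative assumption about the refinement scheme, not the paper's exact k-GWL algorithm; all function and variable names here are hypothetical.

```python
from collections import Counter

def gwl_1_refinement(num_nodes, hyperedges, rounds=3):
    """Sketch of 1-dimensional generalized WL color refinement on a
    hypergraph. Returns the final node-color histogram, which is an
    isomorphism invariant (equal histograms are necessary, not
    sufficient, for isomorphism)."""
    node_color = {v: 0 for v in range(num_nodes)}
    for _ in range(rounds):
        # A hyperedge's color is the multiset of its members' colors.
        edge_color = {
            i: hash(frozenset(Counter(node_color[v] for v in e).items()))
            for i, e in enumerate(hyperedges)
        }
        # A node's new color combines its old color with the multiset
        # of colors of the hyperedges containing it.
        node_color = {
            v: hash((node_color[v],
                     frozenset(Counter(edge_color[i]
                                       for i, e in enumerate(hyperedges)
                                       if v in e).items())))
            for v in range(num_nodes)
        }
    return Counter(node_color.values())

# Two non-isomorphic hypergraphs on 3 nodes that 1-round refinement
# already separates: one 3-node hyperedge vs. two overlapping 2-node edges.
h1 = gwl_1_refinement(3, [{0, 1, 2}])
h2 = gwl_1_refinement(3, [{0, 1}, {1, 2}])
print(h1 == h2)  # → False: the color histograms distinguish them
```

The paper's k-GWL generalizes this by coloring k-tuples of nodes rather than single nodes, which is what yields the strict (k+1)-GWL > k-GWL expressivity hierarchy described in the abstract.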

Subject: ICML.2025 - Poster