
#1 Higher-Order Learning with Graph Neural Networks via Hypergraph Encodings

Authors: Raphaël Pellegrin, Lukas Fesser, Melanie Weber

Higher-order information is crucial for relational learning in many domains where relationships extend beyond pairwise interactions. Hypergraphs provide a natural framework for modeling such relationships, which has motivated recent extensions of graph neural network (GNN) architectures to hypergraphs. Most of these architectures rely on message passing to encode higher-order information. In this paper, we propose to instead use hypergraph-level encodings based on characteristics such as hypergraph Laplacians and discrete curvature notions. These encodings can be used on datasets that are naturally parametrized as hypergraphs, as well as on graph-level datasets, which we reparametrize as hypergraphs to compute the encodings. In both settings, performance increases significantly, by more than 10 percent on social networks. Our theoretical analysis shows that hypergraph-level encodings provably increase the representational power of message-passing graph neural networks beyond that of their graph-level counterparts. For complete reproducibility, we release our codebase: https://github.com/Weber-GeoML/Hypergraph_Encodings.
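To illustrate the kind of object such encodings can be built from, here is a minimal sketch of one standard normalized hypergraph Laplacian, L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}, computed from an incidence matrix. This is an illustrative assumption about one possible Laplacian notion, not the authors' implementation; the function name and example hypergraph are hypothetical.

```python
import numpy as np

def hypergraph_laplacian(H):
    """One common normalized hypergraph Laplacian,
    L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2},
    for a 0/1 incidence matrix H of shape (n_vertices, n_edges).
    (Illustrative sketch; not the paper's codebase.)
    """
    dv = H.sum(axis=1)                      # vertex degrees
    de = H.sum(axis=0)                      # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    n = H.shape[0]
    return np.eye(n) - Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt

# Toy example: 4 vertices, two hyperedges {0, 1, 2} and {2, 3}.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
L = hypergraph_laplacian(H)

# Eigenvectors of L (sorted by eigenvalue) are the sort of spectral
# quantity that hypergraph-level positional encodings can be derived from.
eigvals, eigvecs = np.linalg.eigh(L)
```

The Laplacian is symmetric and positive semi-definite, with smallest eigenvalue zero for a connected hypergraph, so its low-frequency eigenvectors give a stable per-vertex feature that can be appended to node inputs of a GNN.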

Subject: NeurIPS.2025 - Poster