
#1 Primphormer: Efficient Graph Transformers with Primal Representations

Authors: Mingzhen He, Ruikai Yang, Hanling Tian, Youmei Qiu, Xiaolin Huang

Graph Transformers (GTs) have emerged as a promising approach for graph representation learning. Despite their successes, the quadratic complexity of GTs limits scalability on large graphs due to their pairwise computations. To fundamentally reduce the computational burden of GTs, we propose a primal-dual framework that interprets the self-attention mechanism on graphs as a dual representation. Based on this framework, we develop Primphormer, an efficient GT that leverages a primal representation with linear complexity. Theoretical analysis reveals that Primphormer serves as a universal approximator for functions on both sequences and graphs, while also retaining its expressive power for distinguishing non-isomorphic graphs. Extensive experiments on various graph benchmarks demonstrate that Primphormer achieves competitive empirical results while maintaining more modest memory and computational costs.
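To illustrate the primal-versus-dual distinction the abstract alludes to, the sketch below contrasts standard pairwise attention (which materializes an n-by-n score matrix) with a generic feature-map ("primal") reordering that computes the same result in linear time in n. This is a minimal, hedged example of linear attention in general, not Primphormer's actual construction; the feature map `phi` and the softmax-free normalization are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 16                      # number of nodes, feature dimension
Q, K, V = rng.standard_normal((3, n, d))

def phi(x):
    # Positive feature map (elu(x)+1 style); the actual map used by any
    # given linear attention variant is a design choice, assumed here.
    return np.where(x > 0, x + 1.0, np.exp(x))

# "Dual" (pairwise) form: builds the full n-by-n score matrix, O(n^2 d).
scores = phi(Q) @ phi(K).T                       # (n, n)
dual = (scores @ V) / scores.sum(1, keepdims=True)

# "Primal" (linear) form: aggregate keys and values first, O(n d^2).
KV = phi(K).T @ V                                # (d, d) summary of keys/values
Z = phi(K).sum(0)                                # (d,) normalizer
primal = (phi(Q) @ KV) / (phi(Q) @ Z)[:, None]

# By associativity of matrix multiplication the two agree exactly.
print(np.allclose(dual, primal))                 # → True
```

The key point is that the pairwise score matrix never needs to be formed: summing key/value outer products first shrinks the intermediate from n-by-n to d-by-d, which is what makes linear-complexity attention feasible on large graphs.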

Subject: ICML.2025 - Poster