BSqf2k01ag@OpenReview

Total: 1

#1 Towards Graph Foundation Models: Learning Generalities Across Graphs via Task-Trees

Authors: Zehong Wang, Zheyuan Zhang, Tianyi Ma, Nitesh Chawla, Chuxu Zhang, Yanfang Ye

Foundation models are pretrained on large-scale corpora to learn generalizable patterns across domains and tasks, such as contours, textures, and edges in images, or tokens and sentences in text. In contrast, discovering such generalities in graph-structured data, especially across heterogeneous graph tasks, remains an open challenge. To address this, we propose a novel approach to cross-task generalization in graphs via task-trees, which serve as unified learning instances aligning node-, edge-, and graph-level tasks. We theoretically analyze the stability, transferability, and generalization properties of task-trees, showing that pretraining a graph neural network (GNN) on diverse task-trees with a reconstruction objective induces transferable knowledge. This enables efficient adaptation to downstream tasks with minimal fine-tuning. To validate our framework, we introduce the Graph Generality Identifier on Task-Trees (GIT), a graph foundation model that demonstrates strong performance on over 30 graphs across five domains via fine-tuning, in-context learning, and zero-shot generalization. Code and data are available at https://github.com/Zehong-Wang/GIT.

Subject: ICML.2025 - Poster
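
The abstract's core construction, unrolling a task-tree around a learning instance and pretraining an encoder with a reconstruction objective, is concrete enough to sketch. Below is a minimal, illustrative PyTorch sketch, not the GIT implementation: all names (unroll_tree, TreeEncoder, pretrain_step) are hypothetical, the masked-feature loss is only a stand-in for the paper's reconstruction objective, and the real model in the linked repository is a full GNN rather than a single aggregation layer.

import torch
import torch.nn as nn

def unroll_tree(root, adj, depth):
    # Unroll the depth-hop computation tree rooted at `root`. Tree nodes get
    # fresh ids; origin[t] maps a tree node back to the graph node it copies,
    # and edges point child -> parent.
    origin, edges, frontier = [root], [], [0]
    for _ in range(depth):
        nxt = []
        for p in frontier:
            for v in adj[origin[p]]:
                c = len(origin)
                origin.append(v)
                edges.append((c, p))
                nxt.append(c)
        frontier = nxt
    return origin, edges

class TreeEncoder(nn.Module):
    # One round of child -> parent aggregation plus a reconstruction head.
    def __init__(self, dim):
        super().__init__()
        self.mix = nn.Linear(2 * dim, dim)
        self.decode = nn.Linear(dim, dim)  # reconstructs root features

    def forward(self, feats, edges):
        agg = torch.zeros_like(feats)
        for c, p in edges:                 # sum each child's state into its parent
            agg[p] = agg[p] + feats[c]
        return torch.tanh(self.mix(torch.cat([feats, agg], dim=-1)))

def pretrain_step(x, adj, roots, depth, model, opt):
    # One reconstruction step: mask each root's features and recover them
    # from the rest of its task-tree.
    loss = 0.0
    for root in roots:
        origin, edges = unroll_tree(root, adj, depth)
        feats = x[origin].clone()
        target = feats[0].clone()
        feats[0] = 0.0                     # mask the root
        h = model(feats, edges)
        loss = loss + ((model.decode(h[0]) - target) ** 2).mean()
    loss = loss / len(roots)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage on a 4-node path graph: node-level task-trees root at each node.
dim = 8
x = torch.randn(4, dim)
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
model = TreeEncoder(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
print(pretrain_step(x, adj, [0, 1, 2, 3], depth=2, model=model, opt=opt))

In this framing, an edge-level instance would merge the trees of its two endpoints under a shared root, and a graph-level instance would attach every node's tree to one virtual root, which is what lets a single pretrained encoder serve all three task levels, per the abstract's claim that task-trees align node-, edge-, and graph-level tasks.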