H9BdN4f2vz@OpenReview


#1 FedIGL: Federated Invariant Graph Learning for Non-IID Graphs

Authors: Lingren Wang, Wenxuan Tu, Jiaxin Wang, Xiong Wang, Jieren Cheng, Jingxin Liu

Federated Graph Learning (FGL) enables cross-domain graph training while preserving data privacy. Existing approaches typically alleviate structural heterogeneity by sharing generic knowledge (e.g., prototypes, spectral features) obtained by statistically aggregating local structures. However, such methods impose overly strict assumptions about the correlation between structural features and the global objective, which often fail to generalize to local tasks and lead to suboptimal performance. To tackle this issue, we propose a **Fed**erated **I**nvariant **G**raph **L**earning (**FedIGL**) framework based on invariant learning, which disrupts spurious correlations and mines the invariant factors shared across different distributions. Specifically, a server-side global model is trained to capture client-agnostic subgraph patterns shared across clients, whereas client-side models specialize in client-specific subgraph patterns. Furthermore, without compromising privacy, we propose a novel Bi-Gradient Regularization strategy that introduces gradient constraints to guide the model in disentangling client-agnostic from client-specific subgraph patterns, yielding better graph representations. Extensive experiments on graph-level clustering and classification tasks demonstrate the superiority of FedIGL over its competitors.
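The abstract does not specify the exact form of the Bi-Gradient Regularization. One plausible reading is a penalty that discourages overlap between the gradient of the global (client-agnostic) objective and the gradient of a client-specific objective, e.g., by penalizing their squared cosine similarity so the two models encode complementary patterns. The sketch below is a hypothetical illustration of that idea on toy quadratic losses, not the paper's actual formulation; all function names (`grad`, `bi_gradient_penalty`) and the losses are invented for this example.

```python
import numpy as np

def grad(f, w, eps=1e-6):
    """Central-difference numerical gradient of a scalar function f at w."""
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (f(w + d) - f(w - d)) / (2 * eps)
    return g

def bi_gradient_penalty(g_global, g_client):
    """Hypothetical bi-gradient penalty: squared cosine similarity between
    the client-agnostic and client-specific gradients. Driving it toward
    zero pushes the two objectives toward orthogonal update directions,
    i.e., non-overlapping subgraph patterns (one possible interpretation
    of a 'gradient constraint')."""
    num = float(g_global @ g_client)
    den = float(np.linalg.norm(g_global) * np.linalg.norm(g_client)) + 1e-12
    return (num / den) ** 2

# Toy stand-ins for the global and client objectives at shared params w.
w = np.array([1.0, 2.0])
loss_global = lambda v: float(v @ v)              # depends on all coordinates
loss_client = lambda v: float((v[0] - 3.0) ** 2)  # depends on v[0] only

g_g = grad(loss_global, w)   # ≈ [2, 4]
g_c = grad(loss_client, w)   # ≈ [-4, 0]
penalty = bi_gradient_penalty(g_g, g_c)  # ≈ 0.2 (cos² of the angle)
```

In a full training loop this penalty would be added to each side's task loss, so that minimizing it steers the server-side and client-side models toward capturing disjoint factors of variation.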

Subject: NeurIPS.2025 - Poster