3iEWfZh5R2@OpenReview

#1 Learning to Generalize: An Information Perspective on Neural Processes

Authors: Hui Li, Huafeng Liu, Shuyang Lin, Jingyue Shi, Yiran Fu, Liping Jing

Neural Processes (NPs) combine the adaptability of neural networks with the efficiency of meta-learning, offering a powerful framework for modeling stochastic processes. However, existing methods prioritize empirical performance and lack a rigorous theoretical account of generalization. To address this, we propose an information-theoretic framework for deriving generalization bounds for NPs, introducing a dynamical stability regularizer that minimizes sharpness and improves optimization dynamics. Additionally, we show how noise-injected parameter updates complement this regularization. The proposed approach, applicable to a wide range of NP models, is validated on classic benchmarks, including 1D regression, image completion, Bayesian optimization, and contextual bandits. The results demonstrate tighter generalization bounds and superior predictive performance, establishing a principled foundation for building generalizable NP models.
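The abstract mentions noise-injected parameter updates as a complement to sharpness-minimizing regularization. The paper's actual algorithm is not given here; the following is a minimal, hypothetical sketch of the general idea on a toy convex loss, where Gaussian noise is added to the parameters before the gradient is evaluated (all names and hyperparameters, e.g. `sigma` and `lr`, are assumptions for illustration only):

```python
import random

def loss_grad(w):
    # Gradient of the toy loss L(w) = 0.5 * sum(wi^2), i.e. just w.
    return list(w)

def noisy_update(w, rng, lr=0.1, sigma=0.01):
    """One gradient step with Gaussian noise injected into the parameters
    before evaluating the gradient -- a flatness-seeking heuristic, since
    the averaged perturbed gradient penalizes sharp minima."""
    w_pert = [wi + rng.gauss(0.0, sigma) for wi in w]
    g = loss_grad(w_pert)
    return [wi - lr * gi for wi, gi in zip(w, g)]

rng = random.Random(0)
w = [1.0, 1.0, 1.0]
for _ in range(200):
    w = noisy_update(w, rng)

# On this convex toy loss the iterates contract toward the (flat) minimum
# at the origin, up to a small noise floor set by sigma.
norm = sum(wi * wi for wi in w) ** 0.5
```

Intuition for the design: evaluating the gradient at a randomly perturbed point means the update responds to the loss in a neighborhood of the current parameters rather than at a single point, which biases optimization toward flatter regions, the same quantity the sharpness regularization targets.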

Subject: NeurIPS.2025 - Poster