b387eWFV3V@OpenReview

#1 Generalization Bounds for Kolmogorov-Arnold Networks (KANs) and Enhanced KANs with Lower Lipschitz Complexity

Authors: Pengqi Li, Lizhong Ding, Jiarun Fu, Chunhui Zhang, Guoren Wang, Ye Yuan

Kolmogorov-Arnold Networks (KANs) have demonstrated remarkable expressive capacity and predictive power in symbolic learning. However, existing generalization analyses of KANs focus primarily on approximation error while neglecting estimation error, leading to a suboptimal bias-variance trade-off and poor generalization performance. Moreover, the unclear generalization mechanism hinders the design of more effective KAN variants. As the authors of KANs highlighted, they ``would like to explore ways to restrict KANs' hypothesis space so that they can achieve good performance''. To address these challenges, we investigate the generalization mechanism of KANs and design more effective KANs with lower model complexity and better generalization. We define \textit{Lipschitz complexity}, the first structural complexity measure for the deep functions represented by KANs, and derive novel generalization bounds based on it, establishing a theoretical foundation for understanding the generalization behavior of KANs. To reduce \textit{Lipschitz complexity} and thereby strengthen generalization, we propose Lipschitz-Enhanced KANs ($\textbf{LipKANs}$), which integrate a Lip layer and introduce an $L_{1.5}$-regularized loss, yielding tighter generalization bounds. Experiments validate that LipKANs generalize better than KANs when modeling complex distributions. We hope our theoretical bounds and LipKANs lay a foundation for the future development of KANs.
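The abstract does not spell out the exact form of the $L_{1.5}$-regularized loss, so the following is only a minimal sketch of what such a penalty could look like in PyTorch, assuming it is a sum of $|w|^{1.5}$ over trainable coefficients added to the task loss; the names `model`, `criterion`, and the weight `lam` are hypothetical placeholders, not the authors' LipKAN implementation.

```python
import torch


def l15_penalty(params, p=1.5):
    # Illustrative L_{1.5} penalty: sum of |w|^p over all trainable
    # coefficients (an assumption; the paper's exact regularizer may differ).
    return sum(param.abs().pow(p).sum() for param in params)


def regularized_loss(model, criterion, x, y, lam=1e-4):
    # Hypothetical training objective: task loss plus the L_{1.5} penalty,
    # weighted by a regularization coefficient `lam`.
    base = criterion(model(x), y)
    return base + lam * l15_penalty(model.parameters())
```

An $L_{1.5}$ penalty sits between $L_1$ (sparsity-inducing) and $L_2$ (weight decay), which is consistent with the stated goal of shrinking the hypothesis space, and hence the Lipschitz complexity, without zeroing out spline coefficients entirely.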

Subject: NeurIPS.2025 - Poster