dCcWKeO4y4@OpenReview


#1 Stability and Sharper Risk Bounds with Convergence Rate $\tilde{O}(1/n^2)$

Authors: Bowei Zhu, Shaojie Li, Mingyang Yi, Yong Liu

Prior work (Klochkov \& Zhivotovskiy, 2021) establishes high-probability excess risk bounds of at most $O\left(\log (n)/n\right)$ via algorithmic stability for strongly convex learners. We show that under similar common assumptions, namely the Polyak-Lojasiewicz condition, smoothness, and Lipschitz continuity of the loss, rates of order $O\left(\log^2(n)/n^2\right)$ are achievable. To our knowledge, our analysis also provides the tightest high-probability bounds for gradient-based generalization gaps in nonconvex settings.
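A schematic side-by-side of the two rates, as a sketch only: the notation $R$ (population risk), $A(S)$ (the learner's output on a sample $S$ of size $n$), and $w$ is assumed here rather than taken from the paper, and constants and the dependence on the confidence parameter are omitted.

$$
\text{Klochkov \& Zhivotovskiy (2021), strong convexity:} \quad R(A(S)) - \inf_{w} R(w) \;\le\; O\!\left(\frac{\log n}{n}\right),
$$

$$
\text{This work, PL + smoothness + Lipschitz continuity:} \quad R(A(S)) - \inf_{w} R(w) \;\le\; O\!\left(\frac{\log^2 n}{n^2}\right) = \tilde{O}\!\left(\frac{1}{n^2}\right),
$$

with both bounds holding with high probability over the draw of $S$.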

Subject: NeurIPS.2025 - Poster