bmH6UgE1z7@OpenReview


#1 Advancing Wasserstein Convergence Analysis of Score-Based Models: Insights from Discretization and Second-Order Acceleration

Authors: Yifeng Yu, Lu Yu

Score-based diffusion models have emerged as powerful tools in generative modeling, yet their theoretical foundations remain underexplored. In this work, we focus on the Wasserstein convergence analysis of score-based diffusion models. Specifically, we investigate the impact of various discretization schemes, including Euler discretization, exponential integrators, and midpoint randomization methods. Our analysis provides the first quantitative comparison of these discrete approximations, emphasizing their influence on convergence behavior. Furthermore, we explore scenarios where Hessian information is available and propose an accelerated sampler based on the local linearization method. We establish the first Wasserstein convergence analysis for such a Hessian-based method, showing that it achieves an improved convergence rate of order $\widetilde{\mathcal{O}}\left(\frac{\sqrt{d}}{\varepsilon}\right)$, which significantly outperforms the standard rate $\widetilde{\mathcal{O}}\left(\frac{d}{\varepsilon^2}\right)$ of vanilla diffusion models. Numerical experiments on synthetic data and the MNIST dataset validate our theoretical insights.
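The discretization schemes compared in the abstract can be illustrated on a toy problem where the score is known in closed form. The sketch below is not the authors' method; it is a minimal, hypothetical setup assuming a 1-D Ornstein-Uhlenbeck forward process and a Gaussian data distribution N(0, 4), so the exact score stands in for a learned score network. It runs the reverse-time SDE sampler with either a plain Euler step or an exponential-integrator step that handles the linear part of the drift exactly. All constants (`SIGMA0_SQ`, `T`, `N_STEPS`, `N_SAMPLES`) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

SIGMA0_SQ = 4.0    # variance of the toy data distribution N(0, 4)  (assumption)
T = 4.0            # forward diffusion horizon                      (assumption)
N_STEPS = 400      # number of reverse-time discretization steps
N_SAMPLES = 20000  # number of independent sample paths

def var_t(t):
    """Marginal variance of the OU forward process dX = -X dt + sqrt(2) dW."""
    e = np.exp(-2.0 * t)
    return SIGMA0_SQ * e + (1.0 - e)

def score(x, t):
    """Exact score of N(0, var_t(t)); stands in for a learned score network."""
    return -x / var_t(t)

def sample(scheme):
    """Run the reverse SDE dY = [Y + 2*score(Y, T-s)] ds + sqrt(2) dB."""
    h = T / N_STEPS
    y = rng.standard_normal(N_SAMPLES)  # start from the N(0, 1) prior
    for k in range(N_STEPS):
        t = T - k * h                   # forward time at this reverse step
        drift_score = 2.0 * score(y, t)
        z = rng.standard_normal(N_SAMPLES)
        if scheme == "euler":
            # Euler-Maruyama: discretize the full drift at once
            y = y + h * (y + drift_score) + np.sqrt(2.0 * h) * z
        else:
            # Exponential integrator: integrate the linear drift y exactly
            # over the step, freezing only the score term
            eh = np.exp(h)
            y = eh * y + (eh - 1.0) * drift_score + np.sqrt(eh**2 - 1.0) * z
    return y
```

With the exact score, both schemes transport the N(0, 1) prior back to (approximately) the N(0, 4) data distribution; the schemes differ only in how much discretization bias they incur per step, which is the effect the paper's Wasserstein analysis quantifies.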

Subject: NeurIPS.2025 - Poster