G3r4nJXcpU@OpenReview

Total: 1

#1 Proper Hölder-Kullback Dirichlet Diffusion: A Framework for High Dimensional Generative Modeling

Authors: Wanpeng Zhang, Yuhao Fang, Xihang Qiu, Jiarong Cheng, Jialong Hong, Bin Zhai, Qing Zhou, Yao Lu, Ye Zhang, Chun Li

Diffusion-based generative models have long relied on Gaussian priors, with little exploration of alternative distributions. We introduce a Proper Hölder-Kullback Dirichlet framework that uses time-varying multiplicative transformations to define both the forward and reverse diffusion processes. Moving beyond conventional reweighted evidence lower bounds (ELBO) or Kullback–Leibler upper bounds (KLUB), we propose two novel divergence measures: the Proper Hölder Divergence (PHD) and the Proper Hölder–Kullback (PHK) divergence, the latter designed to restore the symmetry missing from existing formulations. When optimizing our Dirichlet diffusion model with PHK, we achieve a Fréchet Inception Distance (FID) of 2.78 on unconditional CIFAR-10. Comprehensive experiments on natural-image datasets validate the model’s generative strengths and confirm PHK’s effectiveness in model training. These contributions expand the diffusion-model family with principled non-Gaussian processes and effective optimization tools, opening new avenues for versatile, high-fidelity generative modeling.
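To give a rough intuition for a Dirichlet forward process on the probability simplex, here is a minimal Python sketch of one noising step. Everything specific in it is an assumption for illustration: the linear interpolation of the concentration toward a uniform prior, the schedule in `t`, and the `alpha0` scale are hypothetical choices, not the paper's actual time-varying multiplicative transformation.

```python
import numpy as np

def dirichlet_forward_step(x, t, alpha0=50.0):
    """Illustrative forward noising on the simplex (not the paper's exact rule).

    Samples from a Dirichlet whose concentration interpolates between the
    data point (t=0) and a uniform prior (t=1); `alpha0` sets how tightly
    samples concentrate around that mean.
    """
    prior = np.full_like(x, 1.0 / len(x))            # uniform point on the simplex
    concentration = alpha0 * ((1.0 - t) * x + t * prior)
    return np.random.dirichlet(concentration)        # sample stays on the simplex

# Usage: noise a 3-class probability vector halfway through the schedule.
x0 = np.array([0.7, 0.2, 0.1])
xt = dirichlet_forward_step(x0, t=0.5)
print(xt, xt.sum())  # entries are nonnegative and sum to 1
```

One convenient property of this illustrative parameterization: since a Dirichlet with concentration c has mean c / sum(c), and both x and the uniform prior sum to 1, the mean of each noised sample is exactly (1 - t) * x + t * prior, so the marginal mean drifts linearly from the data toward the prior as t goes from 0 to 1.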

Subject: NeurIPS.2025 - Poster