

Adding Additional Control to One-Step Diffusion with Joint Distribution Matching

Authors: Yihong Luo, Tianyang Hu, Yifan Song, Jiacheng Sun, Zhenguo Li, Jing Tang

While diffusion distillation has enabled one-step generation through methods like Variational Score Distillation, adapting distilled models to emerging *new controls* -- such as novel structural constraints or the latest user preferences -- remains challenging. Conventional approaches typically require modifying the base diffusion model and redistilling it -- a process that is both computationally intensive and time-consuming. To address these challenges, we introduce Joint Distribution Matching (JDM), a novel approach that minimizes the reverse KL divergence between image-condition joint distributions. By deriving a tractable upper bound, JDM decouples fidelity learning from condition learning. This asymmetric distillation scheme enables our one-step student to handle controls unknown to the teacher model, and it facilitates improved classifier-free guidance (CFG) usage and seamless integration of human feedback learning (HFL). Experimental results demonstrate that JDM surpasses baselines such as multi-step ControlNet with merely one sampling step in most cases, while achieving state-of-the-art performance in one-step text-to-image synthesis through improved use of CFG or HFL integration.
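The paper's exact upper bound is not reproduced in this abstract, but the claimed decoupling of fidelity learning from condition learning is suggested by the standard chain rule for KL divergence applied to the image-condition joint. Writing $q_\theta$ for the one-step student and $p$ for the target distribution (symbol names here are illustrative, not the paper's notation), the joint reverse KL splits into a marginal (fidelity) term and a conditional (condition-matching) term:

```latex
\begin{align}
D_{\mathrm{KL}}\bigl(q_\theta(x, c)\,\|\,p(x, c)\bigr)
&= D_{\mathrm{KL}}\bigl(q_\theta(x)\,\|\,p(x)\bigr) \\
&\quad + \mathbb{E}_{x \sim q_\theta(x)}\!\left[
    D_{\mathrm{KL}}\bigl(q_\theta(c \mid x)\,\|\,p(c \mid x)\bigr)
  \right]
\end{align}
```

Under this factorization, the first term measures image fidelity alone, while the second measures how well generated images respect the condition, which is consistent with an asymmetric scheme in which the teacher supervises fidelity and a separate signal supervises the (teacher-unknown) control.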

Subject: ICCV.2025 - Poster