s14pdQgoLb@OpenReview

Total: 1

#1 Greed is Good: A Unifying Perspective on Guided Generation

Authors: Zander W. Blasingame, Chen Liu

Training-free guided generation is a widely used and powerful technique that allows the end user to exert further control over the generative process of flow/diffusion models. Broadly, two families of techniques have emerged for *gradient-based guidance*: *posterior guidance* (*i.e.*, guidance obtained by projecting the current sample onto the target distribution with the target prediction model) and *end-to-end guidance* (*i.e.*, guidance obtained by backpropagating through the entire ODE solve). In this work, we show that these two seemingly separate families can actually be *unified* by viewing posterior guidance as a *greedy strategy* for *end-to-end guidance*. We explore the theoretical connections between the two families and provide an in-depth theoretical analysis of both techniques relative to the *continuous ideal gradients*. Motivated by this analysis, we present a method for *interpolating* between the two families, enabling a trade-off between compute and the accuracy of the guidance gradients. We validate this work on several inverse imaging problems and on property-guided molecular generation.
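To make the distinction between the two families concrete, here is a minimal sketch (not the authors' code) of how gradient-based guidance is typically computed in each case. The velocity field `velocity`, the guidance loss `guidance_loss`, the single-step Euler projection, and the fixed-step Euler solver are all hypothetical stand-ins chosen for brevity, assuming a flow model with dynamics dx/dt = v(x, t) on the unit time interval.

```python
# Illustrative sketch contrasting posterior guidance (differentiate through a
# one-step projection to the terminal sample) with end-to-end guidance
# (differentiate through the full ODE solve). All components are toy stand-ins.
import torch

torch.manual_seed(0)

# Hypothetical learned velocity field of a flow model, dx/dt = v(x, t).
net = torch.nn.Sequential(torch.nn.Linear(3, 64), torch.nn.SiLU(), torch.nn.Linear(64, 2))

def velocity(x, t):
    # Append time as an extra input feature.
    t_col = t.expand(x.shape[0], 1)
    return net(torch.cat([x, t_col], dim=-1))

def guidance_loss(x1):
    # Hypothetical guidance target: pull terminal samples toward the point (1, 1).
    return ((x1 - torch.tensor([1.0, 1.0])) ** 2).sum()

def posterior_guidance_grad(x_t, t):
    """Greedy / posterior guidance: form a one-step estimate of the terminal
    sample x1 from x_t (the "target prediction"), then differentiate the
    guidance loss through that projection only."""
    x_t = x_t.detach().requires_grad_(True)
    x1_hat = x_t + (1.0 - t) * velocity(x_t, t)  # single Euler step to t = 1
    return torch.autograd.grad(guidance_loss(x1_hat), x_t)[0]

def end_to_end_guidance_grad(x_t, t, n_steps=8):
    """End-to-end guidance: integrate the ODE from t to 1 and backpropagate
    the terminal guidance loss through the entire solve."""
    x_t = x_t.detach().requires_grad_(True)
    x, cur_t = x_t, t
    dt = (1.0 - t) / n_steps
    for _ in range(n_steps):
        x = x + dt * velocity(x, cur_t)
        cur_t = cur_t + dt
    return torch.autograd.grad(guidance_loss(x), x_t)[0]

x_t = torch.randn(4, 2)   # current samples at time t
t = torch.tensor(0.3)
print("posterior-guidance grad:", posterior_guidance_grad(x_t, t)[0])
print("end-to-end grad:        ", end_to_end_guidance_grad(x_t, t)[0])
```

In practice the resulting gradient is scaled and subtracted from (or added to) the sampler update at each step. One plausible way to read the interpolation discussed in the abstract, though this is an assumption rather than the paper's stated construction, is to backpropagate through only a truncated number of solver steps, recovering posterior guidance at one step and end-to-end guidance when the full solve is differentiated.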

Subject: NeurIPS.2025 - Poster