FO2fu3daSL@OpenReview

#1 Generative Modeling Reinvents Supervised Learning: Label Repurposing with Predictive Consistency Learning

Authors: Yang Li, Jiale Ma, Yebin Yang, Qitian Wu, Hongyuan Zha, Junchi Yan

Predicting labels directly from data has been the standard approach in label learning tasks such as supervised learning, where models prioritize compressing and extracting features from the inputs under the assumption that label information is simpler than the input. However, many recent prediction tasks involve complex labels, which exacerbates the challenge of learning mappings from learned features to high-fidelity label representations. To this end, we draw inspiration from the consistency training concept in generative consistency models and propose Predictive Consistency Learning (PCL), a novel learning paradigm that decomposes the full label information into a progressive learning procedure, mitigating the label capture challenge. Besides the data input, PCL receives a noise-perturbed label as an additional reference and pursues predictive consistency across different noise levels. It thereby learns the relationship between latent features and a spectrum of label information, which enables progressive learning for complex predictions and allows multi-step inference analogous to gradual denoising, enhancing prediction quality. Experiments on vision, text, and graph tasks show the superiority of PCL over conventional supervised training on complex label prediction tasks.
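The abstract describes the mechanism only at a high level. Below is a minimal PyTorch sketch of what such a scheme could look like, assuming an MSE-based consistency objective, additive Gaussian label noise, and uniformly spaced noise levels; all names (`PCLNet`, `pcl_step`, `multi_step_predict`) and design choices are illustrative assumptions based on the abstract, not the authors' implementation.

```python
# Hypothetical sketch of the PCL idea from the abstract: the model takes
# (input, noise-perturbed label, noise level) and is trained so that its
# predictions agree across adjacent noise levels. Not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PCLNet(nn.Module):
    """Predicts a label from (input features, noisy label, noise level)."""
    def __init__(self, in_dim, label_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + label_dim + 1, hidden),
            nn.ReLU(),
            nn.Linear(hidden, label_dim),
        )

    def forward(self, x, y_noisy, t):
        # t: noise level in [0, 1], appended as an extra scalar feature.
        return self.net(torch.cat([x, y_noisy, t], dim=-1))

def pcl_step(model, x, y, opt, n_levels=10):
    """One training step: enforce consistency between adjacent noise levels."""
    b = x.size(0)
    # Sample a pair of adjacent noise levels per example (assumption:
    # uniform discretization of [0, 1] into n_levels steps).
    k = torch.randint(1, n_levels, (b, 1))
    t_hi, t_lo = k.float() / n_levels, (k - 1).float() / n_levels
    eps = torch.randn_like(y)
    y_hi = y + t_hi * eps          # more-perturbed label
    y_lo = y + t_lo * eps          # less-perturbed label
    pred_hi = model(x, y_hi, t_hi)
    with torch.no_grad():          # target branch is not back-propagated
        pred_lo = model(x, y_lo, t_lo)
    # Consistency across noise levels, anchored to the true label.
    loss = F.mse_loss(pred_hi, pred_lo) + F.mse_loss(pred_hi, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

@torch.no_grad()
def multi_step_predict(model, x, label_dim, steps=4):
    """Inference analogous to gradual denoising: iteratively refine the label."""
    y = torch.randn(x.size(0), label_dim)   # start from pure noise
    for k in reversed(range(1, steps + 1)):
        t = torch.full((x.size(0), 1), k / steps)
        y = model(x, y, t)                  # each pass denoises further
    return y

# Usage (illustrative shapes only):
model = PCLNet(in_dim=32, label_dim=8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(64, 32), torch.randn(64, 8)
loss = pcl_step(model, x, y, opt)
pred = multi_step_predict(model, x, label_dim=8)
```

Single-step prediction (one forward pass from pure noise) would mirror conventional supervised inference; the multi-step loop is what the abstract's "gradual denoising" analogy suggests improves prediction quality.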

Subject: ICML.2025 - Poster