
#1 Ensemble Distribution Distillation via Flow Matching

Authors: Jonggeon Park, Giung Nam, Hyunsu Kim, Jongmin Yoon, Juho Lee

Neural network ensembles have proven effective at improving performance across a range of tasks; however, their high computational cost limits their applicability in resource-constrained environments or with large models. Ensemble distillation, the process of transferring knowledge from an ensemble teacher to a smaller student model, offers a promising solution to this challenge. The key is to ensure that the student model remains cost-efficient while achieving performance comparable to the ensemble teacher. With this in mind, we propose a novel ensemble distribution distillation method that leverages flow matching to effectively transfer the diversity of the ensemble teacher to the student model. Our extensive experiments demonstrate the effectiveness of the proposed method compared to existing ensemble distillation approaches.
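
The abstract does not spell out the training objective, but a minimal sketch of how conditional flow matching could be used to distill an ensemble's predictive distribution is given below. The network names, architecture, and loss form are illustrative assumptions (standard rectified-flow-style conditional flow matching in PyTorch), not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a student velocity network v(x_t, t, features) is
# trained so that integrating it from Gaussian noise reproduces the
# distribution of teacher-ensemble logits for a given input. All names
# and dimensions here are assumptions for illustration.

class VelocityNet(nn.Module):
    def __init__(self, logit_dim: int, feat_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(logit_dim + 1 + feat_dim, hidden),
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, logit_dim),
        )

    def forward(self, x_t, t, feats):
        # x_t: (batch, logit_dim), t: (batch, 1), feats: (batch, feat_dim)
        return self.net(torch.cat([x_t, t, feats], dim=-1))


def flow_matching_loss(model, teacher_logits, feats):
    """Conditional flow matching loss with a linear interpolation path.

    teacher_logits: (batch, logit_dim) logits sampled from ensemble members.
    feats:          (batch, feat_dim) student features for the same inputs.
    """
    x1 = teacher_logits                # target samples (ensemble predictions)
    x0 = torch.randn_like(x1)          # source noise samples
    t = torch.rand(x1.size(0), 1)      # uniform time in [0, 1]
    x_t = (1 - t) * x0 + t * x1        # point on the straight-line path
    target_v = x1 - x0                 # constant velocity along that path
    pred_v = model(x_t, t, feats)
    return ((pred_v - target_v) ** 2).mean()
```

At inference, one would draw several noise samples and integrate the learned velocity field (e.g., with a few Euler steps) conditioned on the input, yielding multiple ensemble-like predictions from a single student.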

Subject: ICML.2025 - Poster