#1 A Unified Interpretation of Training-Time Out-of-Distribution Detection

Authors: Xu Cheng, Xin Jiang, Zechao Li

This paper explains training-time out-of-distribution (OOD) detection from a novel perspective, namely the interactions between different input variables of deep neural networks (DNNs). Specifically, we provide a unified understanding of why current training-time OOD detection methods are effective: DNNs trained with these methods all encode more complex interactions for inference than DNNs trained without them, and these complex interactions contribute to the superior OOD detection performance. We further conduct thorough empirical analyses to verify that complex interactions play the primary role in OOD detection, by developing a simple yet efficient method that forces a DNN to learn interactions of specific complexities and evaluating the resulting change in OOD detection performance. Besides, we use interactions to investigate why near-OOD samples are harder to distinguish from in-distribution (ID) samples than far-OOD samples: compared with far-OOD samples, the distribution of interactions in near-OOD samples is more similar to that of ID samples. Moreover, we find that training-time OOD detection methods effectively decrease this similarity.
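The abstract does not spell out how an "interaction" between input variables is measured or what "complexity" means. As a rough illustration only, the sketch below assumes the Harsanyi-dividend-style definition commonly used in this line of interaction research: the interaction of a subset S of input variables is I(S) = Σ_{T⊆S} (−1)^{|S|−|T|} v(T), where v(T) is the model output when only the variables in T are kept and the rest are replaced by a baseline, and the "complexity" (order) of I(S) is |S|. The toy model, the zero baseline, and all function names here are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (assumed definition, not the paper's code): Harsanyi-style
# interactions of different orders, where v(T) masks all variables outside T.
from itertools import chain, combinations

import torch


def powerset(variables):
    """All subsets of a tuple of variable indices, including the empty set."""
    return chain.from_iterable(combinations(variables, r) for r in range(len(variables) + 1))


def masked_output(model, x, baseline, kept, target):
    """v(T): logit for `target` when only variables in `kept` are unmasked."""
    masked = baseline.clone()
    masked[list(kept)] = x[list(kept)]
    with torch.no_grad():
        return model(masked.unsqueeze(0))[0, target].item()


def harsanyi_interaction(model, x, baseline, subset, target):
    """I(S) = sum over T ⊆ S of (-1)^{|S|-|T|} * v(T); the order of I(S) is len(subset)."""
    s = len(subset)
    return sum(
        ((-1) ** (s - len(t))) * masked_output(model, x, baseline, t, target)
        for t in powerset(subset)
    )


if __name__ == "__main__":
    # Toy model over 6 scalar input variables, purely for illustration.
    torch.manual_seed(0)
    model = torch.nn.Sequential(torch.nn.Linear(6, 16), torch.nn.ReLU(), torch.nn.Linear(16, 2))
    x = torch.randn(6)
    baseline = torch.zeros(6)  # masking baseline (an assumption)

    low = harsanyi_interaction(model, x, baseline, (0, 1), target=0)            # order-2 (simple)
    high = harsanyi_interaction(model, x, baseline, (0, 1, 2, 3, 4), target=0)  # order-5 (complex)
    print(f"order-2 interaction: {low:.4f}, order-5 interaction: {high:.4f}")
```

Under this reading, the paper's claim would correspond to the distribution over interaction orders: training-time OOD detection methods shift more of the model's inference onto high-order (complex) interactions, and near-OOD inputs yield an interaction distribution closer to that of ID inputs than far-OOD inputs do.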

Subject: ICCV.2025 - Highlight