0598-Paper1034@2025@MICCAI

Total: 1

#1 Multi-expert collaboration and knowledge enhancement network for multimodal emotion recognition

Authors: Wang Kun, Zhao Junyong, Zhang Liying, Zhu Qi, Zhang Daoqiang

Emotion recognition leveraging multimodal data plays a pivotal role in human-computer interaction and in clinical applications such as depression, mania, and Parkinson's disease. However, existing emotion recognition methods are susceptible to heterogeneous feature representations across modalities. Moreover, complex emotions span multiple dimensions, which makes highly trustworthy decisions difficult to achieve. To address these challenges, we propose a novel multi-expert collaboration and knowledge enhancement network for multimodal emotion recognition. First, we devise a cross-modal fusion module that dynamically aggregates complementary features from EEG and facial expressions through attention-guided fusion. Second, we incorporate a feature prototype alignment module to enhance the consistency of multimodal feature representations. Third, we design a prior knowledge enhancement module that injects original dynamic brain networks into feature learning to enrich the feature representations. Finally, we introduce a multi-expert collaborative decision module that refines predictions, improving the robustness of the classification results. Experimental results on the DEAP dataset demonstrate that the proposed method surpasses several state-of-the-art emotion recognition techniques.
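The abstract does not specify the fusion module's internals; as a rough illustration only, a single-head attention-guided cross-modal fusion step (all names, dimensions, and the query/key/value assignment are hypothetical, not taken from the paper) could be sketched as:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fusion(eeg, face, w_q, w_k, w_v):
    """Hypothetical cross-modal attention: EEG features query
    facial-expression features, and the attended facial context is
    concatenated with the EEG features as the fused representation."""
    q = eeg @ w_q                              # (n, d) queries from EEG
    k = face @ w_k                             # (n, d) keys from face
    v = face @ w_v                             # (n, d) values from face
    scores = q @ k.T / np.sqrt(q.shape[-1])    # scaled dot-product scores
    attn = softmax(scores, axis=-1)            # attention weights over face samples
    return np.concatenate([eeg, attn @ v], axis=-1)  # (n, 2d) fused features

# toy features: 4 samples, 8-dimensional per modality
rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 8))
face = rng.standard_normal((4, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
fused = attention_fusion(eeg, face, w_q, w_k, w_v)
print(fused.shape)  # (4, 16)
```

In the actual network the attention weights would be learned end-to-end; this sketch only shows how attention scores can dynamically weight one modality's features when aggregating them with the other's.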

Subject: MICCAI.2025