Liu_Learning_To_Affiliate_Mutual_Centralized_Learning_for_Few-Shot_Classification@CVPR2022@CVF

#1 Learning To Affiliate: Mutual Centralized Learning for Few-Shot Classification

Authors: Yang Liu; Weifeng Zhang; Chao Xiang; Tu Zheng; Deng Cai; Xiaofei He

Few-shot learning (FSL) aims to learn a classifier that can be easily adapted to new tasks given only a few examples. To handle the limited data in few-shot regimes, recent methods tend to collectively use a set of local features to densely represent an image instead of a mixed global feature. They generally follow a unidirectional paradigm, e.g., finding the nearest support feature for every query feature and aggregating these local matches for a joint classification. In this paper, we propose a novel Mutual Centralized Learning (MCL) to fully affiliate the two disjoint sets of dense features in a bidirectional paradigm. We first associate each local feature with a particle that can bidirectionally random walk in a discrete feature space. To estimate the class probability, we propose the dense features' accessibility, which measures the expected number of visits to the dense features of that class in a Markov process. We relate our method to learning a centrality on an affiliation network and demonstrate that it can be plugged into existing methods by highlighting centralized local features. Experiments show that our method achieves new state-of-the-art results.
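The following is a minimal NumPy sketch of the idea described in the abstract, not the authors' released implementation: every query and support dense feature becomes a node, a particle random walks bidirectionally between the two feature sets, and each class is scored by how much of the walk's visiting mass lands on that class's support features. The softmax transition kernel, the temperature `tau`, the fixed number of power-iteration steps, and all function names are illustrative assumptions rather than the exact MCL formulation.

```python
# Hedged sketch of "accessibility" as expected visits of a bidirectional random walk.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def accessibility_scores(query_feats, support_feats_per_class, tau=0.1, n_iters=50):
    """query_feats: (m, d) dense local features of one query image.
    support_feats_per_class: list of (n_c, d) arrays, one per class.
    Returns one score per class (larger = that class's features are visited more)."""
    support = np.concatenate(support_feats_per_class, axis=0)           # (n, d)
    class_ids = np.concatenate([np.full(len(s), c)
                                for c, s in enumerate(support_feats_per_class)])

    # Cosine similarities between the two disjoint dense feature sets.
    q = query_feats / np.linalg.norm(query_feats, axis=1, keepdims=True)
    s = support / np.linalg.norm(support, axis=1, keepdims=True)
    sim = q @ s.T                                                       # (m, n)

    # Bidirectional transition probabilities (query -> support, support -> query).
    P_qs = softmax(sim / tau, axis=1)                                   # (m, n)
    P_sq = softmax(sim.T / tau, axis=1)                                 # (n, m)

    # Markov process over the union of features: each step walks
    # query -> support -> query; power iteration approximates the
    # long-run visiting frequency of the support nodes.
    visits = np.full(len(query_feats), 1.0 / len(query_feats))          # start on query feats
    for _ in range(n_iters):
        support_visits = visits @ P_qs                                  # mass on support feats
        visits = support_visits @ P_sq                                  # walk back to queries

    # Accessibility of a class = expected visits to that class's support features.
    return np.array([support_visits[class_ids == c].sum()
                     for c in range(len(support_feats_per_class))])

# Toy usage: a 5-way task with 3x3 feature maps of 64-d local features.
rng = np.random.default_rng(0)
query = rng.normal(size=(9, 64))
supports = [rng.normal(size=(9, 64)) for _ in range(5)]
print(accessibility_scores(query, supports))
```

In this reading, the bidirectional walk plays the role of the affiliation-network centrality mentioned in the abstract: features that are mutually well matched across the two sets accumulate visiting mass, and that mass is what the per-class score aggregates.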