Collaborative Kernel Discriminant Analysis for Large Scale Multi Class Problems

Published: 2022, Last Modified: 14 Mar 2024 · SADASC 2022
Abstract: The use of kernel learning methods in large-scale contexts remains rather limited. Indeed, in this setting the memory and computing footprint of the kernel matrix can be a constraining factor. Among these methods, Kernel Discriminant Analysis (KDA) is no exception to this rule. In this paper, we present a new learning strategy to address this issue. Instead of entrusting the entire learning task to a single classifier, we propose to share it among an ensemble of classifiers with a limited storage capacity. Our contribution relies on several points. Firstly, our ensemble learning algorithm is "dynamic", i.e. a new classifier is initialized when the previous one overflows its rated capacity. Secondly, a compression strategy based on the overall classification cost of the network is applied after each learning step, in order to minimize the number of classifiers generated during the learning process and to reduce the size of the kernel matrix built in the last layer. To our knowledge, this strategy of collaboration between classifiers is new: it shares the computational and storage burden among several classifiers while ensuring the compression of the kernel matrix. In our study, the network is based on spectral regression kernel discriminant analysis (SRKDA), an efficient multi-class classifier. Extensive experiments on several large-scale data sets show the effectiveness of the proposed algorithm.
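The "dynamic" ensemble idea described above can be sketched as follows. This is a minimal illustration only, under assumed simplifications: the `CapacityClassifier` class is a toy stand-in for an SRKDA member with a bounded sample budget, and `compress` is a placeholder for the paper's cost-based compression step, not its actual algorithm.

```python
class CapacityClassifier:
    """Toy stand-in for an ensemble member with a rated storage capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.samples = []  # stored (x, y) pairs (the kernel support set)

    def is_full(self):
        return len(self.samples) >= self.capacity

    def partial_fit(self, batch):
        self.samples.extend(batch)

    def compress(self, keep_ratio=0.5):
        # Placeholder: shrink the stored set after a learning step so the
        # kernel matrix built from it stays bounded. The paper instead uses
        # a criterion based on the overall classification cost.
        keep = max(1, int(len(self.samples) * keep_ratio))
        self.samples = self.samples[:keep]


class DynamicEnsemble:
    """Spawns a new classifier whenever the current one hits capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.members = [CapacityClassifier(capacity)]

    def learn(self, batch):
        current = self.members[-1]
        if current.is_full():           # overflow -> compress, then spawn
            current.compress()
            current = CapacityClassifier(self.capacity)
            self.members.append(current)
        current.partial_fit(batch)


ensemble = DynamicEnsemble(capacity=4)
for i in range(10):
    ensemble.learn([(i, i % 2)])        # stream one sample per step
```

With a capacity of 4 and 10 streamed samples, the ensemble grows on demand: a new member is created only when the active one is full, so the storage burden is spread across members rather than concentrated in a single kernel matrix.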