SPOCC: Scalable POssibilistic Classifier Combination – toward robust aggregation of classifiers

16 May 2020 · OpenReview Archive Direct Upload
Abstract: We investigate a problem in which each member of a group of learners is trained separately to solve the same classification task. Each learner has access to a training dataset (possibly with overlap across learners), and each trained classifier can be evaluated on a validation dataset. We propose a new approach to aggregating the learner predictions in the possibility theory framework. For each classifier prediction, we build a possibility distribution assessing how likely it is that the classifier prediction is correct, using frequentist probabilities estimated on the validation set. The possibility distributions are aggregated using an adaptive t-norm that can accommodate dependency and poor accuracy of the classifier predictions. We prove that the proposed approach possesses a number of desirable classifier combination robustness properties. Moreover, the method is agnostic to the base learners, scales well with the number of aggregated classifiers, and is incremental: a new classifier can be appended to the ensemble by building upon previously computed parameters and structures.
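To make the aggregation pipeline described in the abstract concrete, here is a minimal illustrative sketch in NumPy. It builds, for each classifier, a possibility distribution over the true class from a validation confusion matrix (normalized so the most plausible class gets possibility 1), then combines the distributions with a fixed t-norm (min here; product is another standard choice). The paper's actual method uses an adaptive t-norm and additional machinery; the function names, the toy confusion matrices, and the choice of t-norm below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def possibility_from_confusion(conf, pred):
    """Given a validation confusion matrix (rows: true class, cols:
    predicted class) and a classifier's predicted label, estimate
    P(true = c | prediction) from the column frequencies, then rescale
    so the maximum is 1, yielding a possibility distribution."""
    col = conf[:, pred].astype(float)
    if col.sum() == 0:  # prediction never seen on validation: no information
        return np.ones(len(col))
    probs = col / col.sum()
    return probs / probs.max()

def combine_tnorm(possibilities, tnorm=np.minimum):
    """Aggregate per-classifier possibility distributions elementwise
    with a t-norm and return the most possible class and the
    aggregated distribution."""
    agg = possibilities[0]
    for p in possibilities[1:]:
        agg = tnorm(agg, p)
    return int(np.argmax(agg)), agg

# Toy example: two classifiers, three classes (all numbers made up).
conf1 = np.array([[8, 1, 1], [1, 7, 2], [0, 2, 8]])  # clf 1 on validation
conf2 = np.array([[9, 1, 0], [2, 6, 2], [1, 1, 8]])  # clf 2 on validation
pi1 = possibility_from_confusion(conf1, pred=0)  # clf 1 predicts class 0
pi2 = possibility_from_confusion(conf2, pred=1)  # clf 2 predicts class 1
label, agg = combine_tnorm([pi1, pi2])
print(label)  # class 0: clf 1's confident prediction dominates under min
```

Because a new classifier only contributes one more distribution to the elementwise t-norm, appending a member to the ensemble requires no recomputation for the existing ones, which mirrors the incrementality property claimed in the abstract.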