Abstract: The Minimum Complexity Machine (MCM) minimizes a tighter bound on the Vapnik-Chervonenkis (VC) dimension than the traditional Support Vector Machine (SVM), leading to a sparser and more generalizable classifier. However, it assumes a symmetric data distribution across classes and also suffers from scaling issues, as solving the underlying optimization problem becomes infeasible for large training data. In this paper, we first introduce an Asymmetric Minimum Complexity Machine (AMCM) that avoids the symmetric scatter assumption and thereby captures the class distributions more robustly. We further introduce a solution for large-scale data classification with AMCM using a Stochastic Quasi-Newton method based optimization model, termed the Stochastic Quasi-Newton Method based Asymmetric Minimum Complexity Machine (SQN-AMCM). The underlying stochastic framework makes the optimization in AMCM scalable while affording fast convergence and stability toward noise and re-sampling during learning, and it simultaneously ensures comparable generalization ability of the resulting classifier. Further, despite being a second-order gradient approach, SQN-AMCM approximates the Hessian through incremental updates, which significantly reduces its training time and memory requirements. Experimental results on several benchmark datasets and an activity recognition application show that SQN-AMCM outperforms related classifiers in terms of speed and accuracy.