GAME: GAussian Mixture Error-based meta-learning architecture

Published: 01 Jan 2023, Last Modified: 06 Nov 2023 · Neural Comput. Appl. 2023
Abstract: In supervised learning, the gap between the ground-truth label and the model output is typically captured by an error function, and a fixed error function corresponds to a specific noise distribution that guides model optimization. However, real noise usually has a much more complex structure. To better fit such noise, in this paper we propose a robust noise model that embeds a mixture-of-Gaussians (MoG) noise modeling strategy into a baseline classification model, chosen here to be the Gaussian mixture model (GMM). Further, to enable automatic selection of the number of mixture components, we apply the penalized likelihood method. We then use an alternating strategy to update the parameters of the noise model and the underlying GMM classifier. From the meta-learning perspective, the proposed model offers a novel way to define the hyperparameters from the error representation. Finally, we compare the proposed approach with three conventional and related classification methods on a synthetic dataset, two benchmark handwriting recognition datasets, and the Yale Face dataset. In addition, we embed the noise modeling strategy into a semantic segmentation task. The numerical results show that our approach achieves the best performance and confirm the effectiveness of MoG noise modeling.
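To make the core idea concrete, the sketch below fits a one-dimensional mixture of Gaussians to model residuals (errors) with EM, using a penalty on the mixing proportions so that weak components shrink to zero and the number of components is selected automatically, in the spirit of penalized-likelihood component selection. This is a minimal illustrative sketch, not the paper's exact formulation: the function name `fit_mog_noise`, the penalty form `max(n_k - gamma, 0)`, and all parameter names are assumptions, and the alternating update of the base classifier is omitted.

```python
import numpy as np

def fit_mog_noise(residuals, k_max=5, gamma=0.5, n_iter=100, tol=1e-6, seed=0):
    """Fit a 1-D mixture of Gaussians to residuals via penalized EM.

    Illustrative sketch: components whose penalized weight drops to
    zero are pruned, giving automatic selection of the component count.
    """
    rng = np.random.default_rng(seed)
    r = np.asarray(residuals, dtype=float)
    # Initialize k_max components from random data points.
    mu = rng.choice(r, size=k_max, replace=False)
    var = np.full(k_max, r.var() + 1e-8)
    pi = np.full(k_max, 1.0 / k_max)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of each component for each residual.
        dens = (pi / np.sqrt(2 * np.pi * var)
                * np.exp(-0.5 * (r[:, None] - mu) ** 2 / var))
        total = dens.sum(axis=1, keepdims=True) + 1e-300
        resp = dens / total
        # M-step with a penalty that can zero out weak components.
        nk = resp.sum(axis=0)
        pi = np.maximum(nk - gamma, 0.0)
        keep = pi > 0
        mu, var, nk, pi, resp = mu[keep], var[keep], nk[keep], pi[keep], resp[:, keep]
        pi /= pi.sum()
        mu = (resp * r[:, None]).sum(axis=0) / nk
        var = (resp * (r[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-8
        # Log-likelihood under the previous parameters; used for convergence.
        ll = np.log(total).sum()
        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return pi, mu, var

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic residuals: a sharp inlier Gaussian plus a broad outlier one.
    r = np.concatenate([rng.normal(0, 0.1, 900), rng.normal(0, 1.0, 100)])
    pi, mu, var = fit_mog_noise(r, k_max=5)
    print("weights:", pi.round(3), "variances:", var.round(3))
```

In a full alternating scheme, the responsibilities returned by such a noise model would reweight the training loss of the base classifier, and the two sets of parameters would be updated in turn.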