Training a Restricted Boltzmann Machine for Classification by Labeling Model Samples

Published: 01 Jan 2015, Last Modified: 14 May 2024 · CoRR 2015 · CC BY-SA 4.0
Abstract: We propose an alternative method for training a classification model. Using the MNIST set of handwritten digits and Restricted Boltzmann Machines, it is possible to reach classification performance competitive with semi-supervised learning if we first train a model in an unsupervised fashion on unlabeled data only, and then manually label model samples instead of training-data samples with the help of a GUI. This approach benefits from the fact that model samples can be presented to the human labeler in a video-like fashion, resulting in a larger number of labeled examples. Also, after some initial training, hard-to-classify examples can be distinguished from easy ones automatically, saving manual work.
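A minimal sketch of the pipeline the abstract describes, under assumptions not stated in the paper: scikit-learn's BernoulliRBM stands in for the paper's RBM, MNIST is loaded via fetch_openml, and `label_samples_manually` is a hypothetical placeholder for the GUI-based human labeling step.

```python
# Sketch only: scikit-learn RBM as a stand-in; the manual labeling step is a placeholder.
import numpy as np
from sklearn.datasets import fetch_openml
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression

# 1) Unsupervised training on unlabeled data only.
X, _ = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = (X / 255.0 > 0.5).astype(np.float64)          # binarize pixels for a Bernoulli RBM
rbm = BernoulliRBM(n_components=500, learning_rate=0.05, n_iter=20, batch_size=64)
rbm.fit(X)

# 2) Draw model samples by running Gibbs chains from random starting points;
#    consecutive chain states form the "video-like" sequence shown to the labeler.
rng = np.random.RandomState(0)
v = rng.binomial(1, 0.5, size=(200, X.shape[1])).astype(np.float64)
for _ in range(1000):                              # burn-in / mixing steps
    v = rbm.gibbs(v)
model_samples = v

# 3) Manual labeling of the model samples (done with a GUI in the paper;
#    this function is a hypothetical stand-in for the human-in-the-loop step).
def label_samples_manually(samples):
    raise NotImplementedError("labels are assigned by a human via a GUI")

labels = label_samples_manually(model_samples)

# 4) Train a classifier on the labeled model samples, here on the RBM's
#    hidden representation (one plausible choice, not necessarily the paper's).
clf = LogisticRegression(max_iter=1000)
clf.fit(rbm.transform(model_samples), labels)
```

Because the labeled examples are model samples rather than training-data samples, the classifier above never sees manually labeled real data; whether to classify raw pixels or the RBM's hidden representation is a design choice left open here.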