Approximated Infomax Early Stopping: Revisiting Gaussian RBMs on Natural Images
Taichi Kiwaki, Takaki Makino, Kazuyuki Aihara
Dec 20, 2013 (modified: Dec 20, 2013) · ICLR 2014 conference submission · readers: everyone
Decision: submitted, no decision
Abstract: We pursue early stopping that helps Gaussian Restricted Boltzmann Machines (GRBMs) learn good natural-image representations in terms of overcompleteness and data fitting. GRBMs are widely considered unsuitable for natural images because they learn non-overcomplete representations containing uniform filters that do not represent sharp edges. Contrary to this common view, we have recently found that GRBMs first gain and subsequently lose sharp edge filters during training. We attribute this phenomenon to a tradeoff between the overcompleteness of GRBM representations and data fitting. To obtain GRBM representations that are both overcomplete and fit the data well, we propose approximated infomax early stopping for GRBMs. The proposed method yields substantial performance improvements for classifiers trained on GRBM representations.
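The idea in the abstract can be sketched as follows: train a GRBM with contrastive divergence while monitoring an information-based score of the hidden representation, and stop (keeping the best weights) once that score stops improving. This is a minimal illustrative sketch, not the authors' implementation: the toy data, the CD-1 update with unit visible variance, and in particular the infomax proxy (sum of marginal Bernoulli entropies of the hidden units) are all assumptions standing in for the paper's approximated infomax criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data standing in for whitened natural-image patches.
X = rng.standard_normal((500, 16))

n_vis, n_hid = 16, 32
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_h = np.zeros(n_hid)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def hidden_probs(v):
    # Gaussian-visible RBM with unit variance: p(h=1|v) = sigmoid(v W + b_h).
    return sigmoid(v @ W + b_h)

def infomax_proxy(v):
    # Stand-in for the paper's approximated infomax score (an assumption):
    # sum of marginal Bernoulli entropies of the hidden units.
    p = hidden_probs(v).mean(axis=0)
    eps = 1e-12
    return -np.sum(p * np.log(p + eps) + (1 - p) * np.log(1 - p + eps))

lr, patience = 0.01, 5
best, wait = -np.inf, 0
W_best = W.copy()
for epoch in range(200):
    # One simplified CD-1 sweep (full batch, no momentum or visible bias).
    ph = hidden_probs(X)
    h = (rng.random(ph.shape) < ph).astype(float)
    v_neg = h @ W.T                      # mean reconstruction (unit variance)
    ph_neg = hidden_probs(v_neg)
    W += lr * (X.T @ ph - v_neg.T @ ph_neg) / len(X)
    b_h += lr * (ph - ph_neg).mean(axis=0)

    score = infomax_proxy(X)
    if score > best:
        best, W_best, wait = score, W.copy(), 0
    else:
        wait += 1
        if wait >= patience:             # stop once the proxy stops improving
            break
```

The early-stopped weights `W_best` (rather than the final `W`) would then provide the representation fed to a downstream classifier, which is the usage the abstract evaluates.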