Improved Learning of One-hidden-layer Convolutional Neural Networks with Overlaps

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: We propose a new algorithm to learn a one-hidden-layer convolutional neural network in which both the convolutional weights and the output weights are parameters to be learned. Our algorithm works for a general class of (potentially overlapping) patches, including structures commonly used in computer vision tasks. Our algorithm draws ideas from (1) isotonic regression for learning neural networks and (2) landscape analysis of non-convex matrix factorization problems. We believe these findings may inspire further development of provable algorithms for learning neural networks and other complex models. While our focus is theoretical, we also present experiments that illustrate our theoretical findings.
Keywords: deep learning, parameter recovery, convolutional neural networks, non-convex optimization
TL;DR: We propose an algorithm for provably recovering parameters (convolutional and output weights) of a convolutional network with overlapping patches.
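To make the model class concrete, here is a minimal sketch (not taken from the paper) of a one-hidden-layer convolutional network with a single shared filter w, per-patch output weights a, and potentially overlapping 1-D patches; the ReLU activation and stride-based patch extraction are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def one_hidden_layer_cnn(x, w, a, stride):
    """Forward pass of a one-hidden-layer CNN with a shared filter w and
    output weights a. Patches have length len(w) and are taken at the given
    stride, so they overlap whenever stride < len(w)."""
    k = len(w)
    patches = [x[i:i + k] for i in range(0, len(x) - k + 1, stride)]
    assert len(patches) == len(a), "one output weight per patch"
    return sum(a_j * relu(np.dot(w, p)) for a_j, p in zip(a, patches))

# Example: input of length 8, filter size 4, stride 2 -> 3 overlapping patches.
rng = np.random.default_rng(0)
x = rng.normal(size=8)
w = rng.normal(size=4)   # convolutional weights to be learned
a = rng.normal(size=3)   # output weights to be learned
print(one_hidden_layer_cnn(x, w, a, stride=2))
```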
