Layer Sparsity in Neural Networks

Published: 01 Jan 2020, Last Modified: 12 May 2023 · CoRR 2020
Abstract: Sparsity has become popular in machine learning, because it can save computational resources, facilitate interpretations, and prevent overfitting. In this paper, we discuss sparsity in the framework of neural networks. In particular, we formulate a new notion of sparsity that concerns the networks' layers and, therefore, aligns particularly well with the current trend toward deep networks. We call this notion layer sparsity. We then introduce corresponding regularization and refitting schemes that can complement standard deep-learning pipelines to generate more compact and accurate networks.
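The layer-wise regularization idea from the abstract can be sketched as a group penalty that treats each layer's weight matrix as one group, so whole layers can be driven to zero and then removed. This is a minimal illustrative sketch in NumPy; the function names, the Frobenius-norm group penalty, and the pruning threshold are assumptions for illustration, not the paper's actual regularization or refitting scheme.

```python
import numpy as np

def layer_sparsity_penalty(weights, lam=0.1):
    """Group-lasso-style penalty over whole layers.

    Treating each layer's weight matrix as a single group encourages
    entire layers to shrink toward zero, so they can be removed
    (e.g., replaced by an identity/skip connection).
    NOTE: illustrative assumption -- the paper's exact penalty may differ.
    """
    return lam * sum(np.linalg.norm(W) for W in weights)

def prune_layers(weights, tol=1e-3):
    """Drop layers whose weights were driven (near) to zero."""
    return [W for W in weights if np.linalg.norm(W) > tol]

# Toy example: three layers, the middle one effectively zeroed out by training.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 4)), np.zeros((4, 4)), rng.normal(size=(4, 4))]
penalty = layer_sparsity_penalty(weights, lam=0.1)
kept = prune_layers(weights)
print(len(kept))  # prints 2: the zero layer is pruned, yielding a more compact network
```

After pruning, a refitting pass (retraining the surviving layers without the penalty) would restore accuracy, which matches the abstract's description of complementing standard deep-learning pipelines.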