Abstract: While Neural Architecture Search (NAS) and network pruning have recently re-emerged as powerful techniques for designing more effective networks with fewer parameters, their focus has been on improving deep neural network models with millions of parameters. However, the number of connections and the connectivity pattern between layers also influence the learning performance of shallow two-layer networks, such as the classic Restricted Boltzmann Machine (RBM). This work presents a comprehensive study of the effect of the connectivity space on the learning performance of RBMs using the bars-and-stripes (BAS) model and the MNIST dataset. For BAS, the main findings show that both the pattern and the number of connections play a fundamental role in learning performance, while for MNIST the number of connections is more important among the connection structures considered. Moreover, the fully connected network is outperformed by different patterns in both scenarios, indicating the importance of designing more effective connectivity patterns even for simple two-layer neural networks.