Lightweight Convolutional Neural Networks By Hypercomplex Parameterization

Published: 28 Jan 2022, Last Modified: 13 Feb 2023
ICLR 2022 Submitted
Keywords: Hypercomplex Neural Networks, Lightweight Neural Networks, Quaternion Neural Networks, Parameterized Hypercomplex Convolutions, Hypercomplex Representation Learning
Abstract: Hypercomplex neural networks have proven able to reduce the overall number of parameters while maintaining strong performance by leveraging the properties of Clifford algebras. Recently, hypercomplex linear layers have been further improved through efficient parameterized Kronecker products. In this paper, we define the parameterization of hypercomplex convolutional layers to develop lightweight and efficient large-scale convolutional models. Our method learns the convolution rules and the filter organization directly from data, without requiring a rigidly predefined domain structure. The proposed approach is flexible enough to operate in any user-defined or tuned domain, from 1D to $n$D, regardless of whether the algebra rules are preset. This malleability allows processing multidimensional inputs in their natural domain without appending further dimensions, as is done instead in quaternion neural networks for 3D inputs such as color images. As a result, the proposed method operates with $1/n$ of the free parameters of its analog in the real domain. We demonstrate the versatility of this approach across multiple application domains by performing experiments on various image and audio datasets, in which our method outperforms real- and quaternion-valued counterparts.
One-sentence Summary: We propose a lightweight hypercomplex parameterization of convolutional layers that outperforms real- and quaternion-valued neural networks in different application domains.
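As a concrete illustration of the Kronecker-product parameterization described in the abstract, below is a minimal PyTorch sketch. It builds the full convolution kernel as a learned sum of Kronecker products $W = \sum_i A_i \otimes F_i$, where the $n \times n$ matrices $A_i$ play the role of the algebra rules and the $F_i$ are shared filter blocks, giving roughly $1/n$ of the free parameters of a real-valued convolution. The class name `PHConv2d`, the initialization scale, and the omission of bias/stride options are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PHConv2d(nn.Module):
    """Sketch of a parameterized hypercomplex convolution (PHC).

    The full kernel is built as a sum of Kronecker products
    W = sum_i A_i (x) F_i: the n x n matrices A_i encode the learned
    algebra rules, while the F_i are shared filter blocks, so the
    layer uses roughly 1/n of the free parameters of a real-valued
    Conv2d of the same shape.
    """

    def __init__(self, n, in_channels, out_channels, kernel_size):
        super().__init__()
        assert in_channels % n == 0 and out_channels % n == 0
        self.n = n
        # Learned algebra rules: n matrices of shape (n, n).
        self.A = nn.Parameter(torch.randn(n, n, n))
        # Filter blocks: n tensors of shape (out/n, in/n, k, k);
        # the 0.02 scale is an arbitrary illustrative choice.
        self.filters = nn.Parameter(
            0.02 * torch.randn(n, out_channels // n, in_channels // n,
                               kernel_size, kernel_size))

    def forward(self, x):
        # Sum over i of the Kronecker products A_i (x) F_i, laid out
        # so the reshape below yields a full (out, in, k, k) kernel.
        kron = torch.einsum('iab,iochw->aobchw', self.A, self.filters)
        a, o, b, c, h, w = kron.shape
        kernel = kron.reshape(a * o, b * c, h, w)
        return F.conv2d(x, kernel, padding=h // 2)

# Usage: an n=4 (quaternion-like) layer mapping 8 to 16 channels.
layer = PHConv2d(n=4, in_channels=8, out_channels=16, kernel_size=3)
y = layer(torch.randn(1, 8, 32, 32))  # -> shape (1, 16, 32, 32)
```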