Abstract: Convolutional Neural Networks (CNNs) have achieved tremendous success on many computer vision tasks, opening a promising prospect of deploying CNNs on mobile platforms. An obstacle to this prospect is the tension between the intensive resource consumption of CNNs and the limited resource budget of mobile platforms. Existing works generally trade accuracy for resource consumption, adopting simpler, less accurate architectures for higher energy efficiency. An emerging opportunity to both increase accuracy and decrease resource consumption is \textbf{class skew}, \textit{i.e.}, the strong temporal and spatial locality in the appearance of classes. However, exploiting class skew efficiently is challenging due to both the frequent switches between class skews and the huge number of possible class skews. Existing works use transfer learning to adapt the model to the current class skew at runtime, which is resource-intensive. In this paper, we propose the \textbf{probability layer}, an \textit{easily-implemented and highly flexible add-on module} that adapts the model efficiently at runtime \textit{without any fine-tuning} while achieving \textit{equivalent or better} performance than transfer learning. Further, combining the probability layer with pruning methods both \textit{increases accuracy} and \textit{decreases resource consumption} at runtime.
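To make the idea of a post-hoc, fine-tuning-free add-on concrete, below is a minimal sketch of how such a probability layer might re-weight a trained CNN's softmax outputs toward the classes observed in the current class skew. The function name, the Bayes-style rescaling by the ratio of runtime to training class priors, and the uniform-training-prior default are illustrative assumptions, not details stated in the abstract.

```python
import numpy as np

def probability_layer(softmax_probs, runtime_priors, train_priors=None, eps=1e-12):
    """Re-weight softmax outputs of a fixed, pre-trained CNN toward the
    classes that are frequent in the current class skew (hypothetical sketch).

    softmax_probs  : (num_classes,) output of the unmodified CNN
    runtime_priors : (num_classes,) estimated class frequencies in the current skew
    train_priors   : (num_classes,) training-set class frequencies
                     (uniform if None, as in balanced datasets such as CIFAR-100)
    """
    probs = np.asarray(softmax_probs, dtype=np.float64)
    p_run = np.asarray(runtime_priors, dtype=np.float64)
    p_train = (np.full_like(probs, 1.0 / probs.size)
               if train_priors is None
               else np.asarray(train_priors, dtype=np.float64))

    # Rescale: boost classes frequent in the current skew, suppress absent ones,
    # then renormalize so the result is again a probability distribution.
    rescaled = probs * p_run / (p_train + eps)
    return rescaled / rescaled.sum()

# Usage: a 5-class toy example where only classes 0 and 2 appear at runtime.
cnn_out = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
skew = np.array([0.6, 0.0, 0.4, 0.0, 0.0])
print(probability_layer(cnn_out, skew))  # mass shifts entirely to classes 0 and 2
```

Because the rescaling is a per-prediction arithmetic operation on the output vector, it adds negligible compute compared with re-training or transfer learning, which is consistent with the abstract's claim of runtime adaptation without fine-tuning.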
Keywords: Class skew, Runtime adaptation
Data: [CIFAR-100](https://paperswithcode.com/dataset/cifar-100), [ImageNet](https://paperswithcode.com/dataset/imagenet)