AgileGCN: Accelerating Deep GCN with Residual Connections using Structured Pruning

Published: 01 Jan 2022, Last Modified: 15 May 2023, MIPR 2022
Abstract: Deep Graph Convolutional Networks (GCNs) with many layers have achieved state-of-the-art results in applications such as point cloud classification and semantic segmentation. However, they are computationally expensive and suffer from high run-time latency. In this paper, we propose AgileGCN, a novel framework to compress and accelerate deep GCN models with residual connections using structured pruning. Specifically, in each residual structure of a deep GCN, channel sampling and channel padding are applied to the input and output channels of a convolutional layer, respectively, which significantly reduces its floating point operations (FLOPs) and number of parameters while keeping the residual addition dimensionally valid. Experimental results on two benchmark point cloud datasets demonstrate that AgileGCN achieves significant reductions in FLOPs and parameters while maintaining the accuracy of the unpruned models for both point cloud classification and segmentation.
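To make the channel sampling/padding idea concrete, the sketch below illustrates one plausible reading of the abstract, not the authors' actual implementation: inside a residual block, the convolution consumes only a sampled subset of input channels, and its output is zero-padded back to the full channel count so the skip connection still adds element-wise. The function name, the use of a 1x1 convolution (a plain matrix multiply over per-point features), and the index arrays are all illustrative assumptions.

```python
import numpy as np

def pruned_residual_block(x, weight, in_idx, out_idx):
    """Hypothetical pruned residual block (a sketch, not AgileGCN's code).

    x:       (C, N) feature matrix, C channels over N points
    weight:  (len(out_idx), len(in_idx)) pruned 1x1-conv weight
    in_idx:  input channels kept by channel sampling
    out_idx: output channels actually computed; the rest are zero-padded
    """
    x_sampled = x[in_idx]                       # channel sampling: read fewer inputs
    y_small = weight @ x_sampled                # cheap conv on the reduced channels
    y = np.zeros_like(x)
    y[out_idx] = y_small                        # channel padding: restore full width
    return x + y                                # residual add still type-checks
```

Because the unselected output channels are padded with zeros, they pass the input through unchanged via the skip connection, while the dense compute (and hence FLOPs and parameters) scales with the sampled channel counts rather than the full width.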