Complex-valued Parallel Convolutional Recurrent Neural Networks for Automatic Modulation Classification

Published: 01 Jan 2022, Last Modified: 13 Nov 2024, CSCWD 2022, CC BY-SA 4.0
Abstract: Following the great success of deep learning in signal processing, many models based on real-valued convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have been proposed for automatic modulation classification (AMC). However, the modulation signal is not only temporally dependent but also complex-valued. Real-valued deep learning models treat the real and imaginary parts of the complex-valued modulation signal as two independent real-valued inputs, which destroys the structure of the raw signal and makes the model less interpretable. This paper therefore proposes a novel complex-valued parallel convolutional recurrent neural network (CPCRNN) specifically for AMC. CPCRNN combines a parallel complex-valued CNN and RNN with a redesigned complex-valued activation function and complex-valued max pooling. The model feeds the raw complex-valued signal directly to the complex-valued CNN to obtain complex-valued feature maps, which are then converted to amplitude and phase and fed to the RNN. The model thus first extracts the complex-valued features of the modulation signal with complex-valued CNNs, and then extracts its temporal features with RNNs. CPCRNN achieved an overall accuracy of 62.29% and 69.93% on the benchmark datasets RadioML2016.10A and RadioML2018.01-simple, respectively, outperforming state-of-the-art algorithms.
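The abstract describes a pipeline in which a complex-valued CNN processes the raw I/Q signal, the resulting complex feature maps are converted to amplitude and phase, and an RNN then models the temporal structure. The PyTorch sketch below is not the authors' code; it only illustrates one way such a pipeline could be wired together. The layer widths, kernel sizes, the CReLU-style activation (the paper uses its own redesigned complex activation), the magnitude-based max pooling, and the choice of a GRU are illustrative assumptions.

```python
# Minimal sketch of a CPCRNN-style pipeline, assuming a CReLU activation,
# magnitude-based complex max pooling, and a GRU; all hyperparameters are
# illustrative, not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ComplexConv1d(nn.Module):
    """Complex 1-D convolution: (a+ib)*(x+iy) = (ax - by) + i(ay + bx)."""

    def __init__(self, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        self.conv_re = nn.Conv1d(in_ch, out_ch, kernel_size, padding=padding)
        self.conv_im = nn.Conv1d(in_ch, out_ch, kernel_size, padding=padding)

    def forward(self, x_re, x_im):
        out_re = self.conv_re(x_re) - self.conv_im(x_im)
        out_im = self.conv_re(x_im) + self.conv_im(x_re)
        return out_re, out_im


def complex_relu(x_re, x_im):
    """CReLU-style activation: ReLU applied to real and imaginary parts."""
    return torch.relu(x_re), torch.relu(x_im)


def complex_max_pool1d(x_re, x_im, kernel_size):
    """Pool by modulus: keep the complex value with the largest magnitude."""
    mag = torch.sqrt(x_re ** 2 + x_im ** 2)
    _, idx = F.max_pool1d(mag, kernel_size, return_indices=True)
    return x_re.gather(2, idx), x_im.gather(2, idx)


class CPCRNNSketch(nn.Module):
    def __init__(self, num_classes=11):
        super().__init__()
        self.conv1 = ComplexConv1d(1, 32, kernel_size=7, padding=3)
        self.conv2 = ComplexConv1d(32, 64, kernel_size=5, padding=2)
        self.rnn = nn.GRU(input_size=128, hidden_size=128, batch_first=True)
        self.fc = nn.Linear(128, num_classes)

    def forward(self, x_re, x_im):
        # x_re, x_im: (batch, 1, signal_length), the raw I and Q components.
        h_re, h_im = complex_relu(*self.conv1(x_re, x_im))
        h_re, h_im = complex_max_pool1d(h_re, h_im, kernel_size=2)
        h_re, h_im = complex_relu(*self.conv2(h_re, h_im))
        h_re, h_im = complex_max_pool1d(h_re, h_im, kernel_size=2)
        # Convert the complex feature maps to amplitude and phase for the RNN.
        amp = torch.sqrt(h_re ** 2 + h_im ** 2)
        phase = torch.atan2(h_im, h_re)
        seq = torch.cat([amp, phase], dim=1).transpose(1, 2)  # (B, L, 128)
        _, h_n = self.rnn(seq)
        return self.fc(h_n[-1])


# Usage with RadioML-style frames of shape (batch, 2, 128), split into I and Q.
x = torch.randn(8, 2, 128)
model = CPCRNNSketch(num_classes=11)
logits = model(x[:, :1, :], x[:, 1:, :])
```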