Abstract: Remote scene classification plays a vital role in many applications. However, satellite images are often blurred and degraded by aerosol scattering under fog, haze, and other weather conditions, which reduces image contrast and color fidelity. State-of-the-art remote sensing classification models built on convolutional neural networks (CNNs) are mostly trained on annotated datasets of clear satellite images; when applied to blurred images, they suffer a severe drop in performance. To address this problem, we build on the domain adaptation algorithm TADA and propose the Transferable Attention enhanced Adversarial Adaptation Network (TA³N), which exploits annotated clear images by transferring knowledge from the clear-image domain to the blurred-image domain. TA³N first integrates spatial attention to focus on salient areas that are both discriminative and transferable. In addition, a domain discriminator and adversarial training via a gradient reversal layer are used to minimize the discrepancy between features extracted from the clear and degraded domains. We synthesize a degraded remote scene classification dataset, SSI, based on the FoHIS model. Experiments on the degraded SSI show that TA³N significantly outperforms the baseline and other state-of-the-art domain adaptation methods.
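The adversarial training described above hinges on a gradient reversal layer (GRL): an identity map in the forward pass that negates (and scales) gradients in the backward pass, so the feature extractor learns to fool the domain discriminator. The following is a minimal, framework-free sketch of that mechanism, not the authors' implementation; the class name, the `lam` scaling parameter, and the list-based gradient representation are illustrative assumptions.

```python
# Minimal sketch of a gradient reversal layer (GRL) for adversarial
# domain adaptation. Illustrative only -- not the TA3N authors' code.
# Forward: identity on the features. Backward: multiply the incoming
# gradient by -lam, so minimizing the discriminator loss downstream
# *maximizes* it with respect to the feature extractor upstream.

class GradientReversal:
    def __init__(self, lam: float = 1.0):
        self.lam = lam  # reversal strength (often annealed during training)

    def forward(self, features):
        # Identity: features pass through to the domain discriminator unchanged.
        return features

    def backward(self, grad_output):
        # Reverse and scale the gradient flowing back to the feature extractor.
        return [-self.lam * g for g in grad_output]


grl = GradientReversal(lam=0.5)
features = [0.2, -1.3, 4.0]
assert grl.forward(features) == features      # forward pass is the identity
print(grl.backward([1.0, -2.0, 0.5]))         # prints [-0.5, 1.0, -0.25]
```

In practice this hook is registered inside an autodiff framework (e.g. as a custom backward function), so the reversal happens automatically when the combined classification and domain-discrimination loss is backpropagated.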