NCFT: Automatic Matching of Multimodal Image Based on Nonlinear Consistent Feature Transform

Published: 01 Jan 2022 · Last Modified: 17 May 2025 · IEEE Geoscience and Remote Sensing Letters, 2022 · License: CC BY-SA 4.0
Abstract: Automatic matching of multimodal images remains a critical and challenging task in many remote sensing and computer vision applications. Because of significant nonlinear radiation distortions (NRDs) between multimodal images, many traditional feature matching methods, which are sensitive to NRD, struggle to achieve satisfactory matching performance. To cope with this problem, this letter proposes a novel feature matching method, named nonlinear consistent feature transform (NCFT), that is robust to large NRD. NCFT makes three main contributions. First, we propose a new consistent feature map, used in place of image intensity for feature point detection and description; the consistent feature map encodes structural information and provides rich, robust features. Second, we propose a mean-residual maximum index map (MR-MIM) for feature description, constructed from the Log-Gabor convolution sequence computed on the consistent feature map. Finally, structure descriptors are built from the MR-MIM, and multimodal image matching is achieved by computing correspondences between them. Extensive experimental results demonstrate that NCFT effectively overcomes the problem of NRD, outperforms other state-of-the-art methods, and improves matching accuracy and robustness on different multimodal image datasets.
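The letter itself does not include code. As a rough orientation only, the sketch below shows how a plain maximum index map (MIM) is commonly built from a Log-Gabor orientation bank, the construct of which the paper's MR-MIM is a mean-residual variant; the mean-residual step and the consistent feature map are not reproduced here, and all function names and parameter values are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def log_gabor_bank(shape, n_scales=4, n_orients=6, min_wavelength=3.0,
                   mult=2.1, sigma_on_f=0.55, d_theta_ratio=1.5):
    """Build a bank of 2-D Log-Gabor filters in the frequency domain.
    Parameter values are typical defaults, not those of the paper."""
    rows, cols = shape
    u = np.fft.fftshift(np.fft.fftfreq(cols))
    v = np.fft.fftshift(np.fft.fftfreq(rows))
    uu, vv = np.meshgrid(u, v)
    radius = np.sqrt(uu ** 2 + vv ** 2)
    radius[rows // 2, cols // 2] = 1.0          # avoid log(0) at DC
    theta = np.arctan2(-vv, uu)

    bank = []
    for o in range(n_orients):
        angle = o * np.pi / n_orients
        # Wrapped angular distance to this filter orientation.
        d_theta = np.abs(np.arctan2(np.sin(theta - angle),
                                    np.cos(theta - angle)))
        theta_sigma = np.pi / n_orients / d_theta_ratio
        spread = np.exp(-(d_theta ** 2) / (2 * theta_sigma ** 2))
        per_orient = []
        for s in range(n_scales):
            f0 = 1.0 / (min_wavelength * mult ** s)
            radial = np.exp(-(np.log(radius / f0) ** 2)
                            / (2 * np.log(sigma_on_f) ** 2))
            radial[rows // 2, cols // 2] = 0.0   # zero DC response
            per_orient.append(radial * spread)
        bank.append(per_orient)
    return bank

def maximum_index_map(feature_map, n_scales=4, n_orients=6):
    """Plain MIM: per pixel, the index of the orientation whose
    scale-summed Log-Gabor amplitude is largest."""
    img_fft = np.fft.fftshift(np.fft.fft2(feature_map.astype(np.float64)))
    bank = log_gabor_bank(feature_map.shape, n_scales, n_orients)
    amplitude = np.zeros((n_orients,) + feature_map.shape)
    for o, per_orient in enumerate(bank):
        for filt in per_orient:
            response = np.fft.ifft2(np.fft.ifftshift(img_fft * filt))
            amplitude[o] += np.abs(response)
    return np.argmax(amplitude, axis=0).astype(np.uint8)
```

In MIM-style descriptors, local histograms of these orientation indices, rather than raw intensities, are what make the description comparatively insensitive to nonlinear radiometric differences; the paper's MR-MIM applies this idea on its consistent feature map rather than on the original image.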