MASFNet: Multiscale Adaptive Sampling Fusion Network for Object Detection in Adverse Weather

Published: 01 Jan 2025, Last Modified: 26 Jul 2025. IEEE Trans. Geosci. Remote Sens., 2025. License: CC BY-SA 4.0
Abstract: Object detection methods based on deep convolutional neural networks (CNNs) have driven major advances on normal images. However, such success is rarely achieved in adverse weather owing to reduced visibility. To tackle this problem, we propose a multiscale adaptive sampling fusion network, named MASFNet. In this article, we design a feature adaptive enhancement network (FAENet) consisting of three modules that adaptively enhance feature maps in adverse scenarios. These modules are integrated via a Laplacian pyramid and perform receptive field fusion, attention perception, and affine transformation for image feature enhancement. To further improve detection performance, we propose a multiscale sampling fusion pyramid network (MSFNet), which fuses features at different scales to enrich semantic information. Experimental results demonstrate that MASFNet achieves 73.68% and 30.95% mAP on the real-scene fog dataset (RTTS) and the Foggy Driving dataset (FDD), respectively. Additionally, on the real-world low-illumination dataset (ExDark), MASFNet attains a substantial mAP of 63.80%, surpassing current state-of-the-art (SOTA) object detectors while remaining lightweight and fast. The source code will be released at https://github.com/PolarisFTL/MASFNet.
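The abstract states that FAENet's modules are integrated by a Laplacian pyramid. For readers unfamiliar with the construction, the following is a minimal NumPy sketch of a Laplacian pyramid decomposition and its exact reconstruction; it is an illustration of the general technique only, not the paper's FAENet implementation, and the simple average-pool downsampling and nearest-neighbour upsampling used here are assumptions (real pipelines typically use Gaussian filtering).

```python
import numpy as np

def downsample(x):
    # 2x2 average pooling: a simple stand-in for Gaussian blur + subsample.
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(x, shape):
    # Nearest-neighbour upsampling back to the finer level's shape.
    y = np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)
    return y[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels=3):
    # Each finer level stores the high-frequency detail lost by downsampling;
    # the final entry stores the remaining low-frequency residual.
    pyramid, cur = [], img.astype(np.float64)
    for _ in range(levels - 1):
        down = downsample(cur)
        pyramid.append(cur - upsample(down, cur.shape))
        cur = down
    pyramid.append(cur)
    return pyramid

def reconstruct(pyramid):
    # Invert the decomposition: upsample the residual, add back the details.
    cur = pyramid[-1]
    for detail in reversed(pyramid[:-1]):
        cur = upsample(cur, detail.shape) + detail
    return cur
```

Because each detail band is defined as the exact difference between a level and its upsampled coarse version, reconstruction recovers the input image exactly (for power-of-two sizes in this sketch); per-band processing before reconstruction is what lets a network enhance different frequency bands independently.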