Abstract: Adverse meteorological conditions, particularly fog and rain, present significant challenges to computer vision algorithms and autonomous systems. This work presents MuFoRa, a novel, controllable, and measured multimodal dataset recorded at CARISSMA’s indoor test facility, specifically designed to assess perceptual difficulties in foggy and rainy environments. The dataset bridges a research gap in public benchmarking datasets, where quantifiable weather parameters are lacking. The proposed dataset comprises synchronized data from two sensor modalities, RGB stereo cameras and LiDAR sensors, captured under varying intensities of fog and rain. It also incorporates synchronized meteorological annotations, such as visibility through fog and rain precipitation levels, and the methods section contributes a detailed explanation of the diverse weather effects observed during data collection. The dataset’s utility is demonstrated through a baseline evaluation example, asse