Abstract: Multimodal predictive models have gained significant attention in recent years due to their ability to leverage data from multiple modalities, enabling a richer representation of the underlying phenomena. Traditional fusion methods, however, struggle with the curse of dimensionality and increased computational complexity. In this paper, we introduce a novel meta-learning model called Classification with Meta-Learning and Multimodal Stratified Time Series Forest (METFORC) to enhance multimodal fusion in classification tasks. METFORC trains a decision tree on the scores generated by Time Series Forest (TSF) classifiers, enabling the effective aggregation of multimodal information. By addressing the limitations of traditional fusion techniques, METFORC achieves improved classification accuracy. We validate the performance of the proposed model on three benchmark datasets from the solar weather prediction domain, showing its superiority over existing fusion approaches. Our findings demonstrate the effectiveness of METFORC in handling high-dimensional multimodal data and its potential for enhancing decision-making in complex tasks.
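The abstract describes a score-level stacking architecture: one TSF base classifier per modality, with a decision tree as the meta-learner over the concatenated class-probability scores. The paper gives no reference implementation here; the following is a minimal sketch of that idea, assuming sktime's TimeSeriesForestClassifier for the base learners, scikit-learn's DecisionTreeClassifier as the meta-learner, one univariate series per modality, and synthetic placeholder data. All variable names and hyperparameters are hypothetical, not the authors' settings.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sktime.classification.interval_based import TimeSeriesForestClassifier

# Hypothetical multimodal data: three modalities, each a univariate
# time series of length 60 per instance (shape: n_instances, 1, n_time).
n_train, n_test, n_time = 200, 50, 60
rng = np.random.default_rng(0)
modalities_train = [rng.normal(size=(n_train, 1, n_time)) for _ in range(3)]
modalities_test = [rng.normal(size=(n_test, 1, n_time)) for _ in range(3)]
y_train = rng.integers(0, 2, size=n_train)

# Stage 1: fit one TSF per modality and collect its class-probability
# scores; each modality is handled by its own base classifier.
base_models, train_scores = [], []
for X_mod in modalities_train:
    clf = TimeSeriesForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_mod, y_train)
    base_models.append(clf)
    train_scores.append(clf.predict_proba(X_mod))

# Stage 2: a decision tree meta-learner fuses the stacked scores,
# replacing high-dimensional raw-feature fusion with a low-dimensional
# score space (n_modalities * n_classes features).
meta = DecisionTreeClassifier(max_depth=5, random_state=0)
meta.fit(np.hstack(train_scores), y_train)

# Inference: each modality's TSF scores feed the meta-learner.
test_scores = np.hstack(
    [m.predict_proba(X) for m, X in zip(base_models, modalities_test)]
)
y_pred = meta.predict(test_scores)
```

A practical stacking implementation would typically generate the stage-1 training scores with out-of-fold predictions (e.g. cross_val_predict) rather than in-sample predict_proba, to keep the meta-learner from overfitting to base-model training error; the sketch omits this for brevity.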