IMWA: Iterative Model Weight Averaging benefits class-imbalanced learning

Published: 01 Jan 2025 · Last Modified: 16 May 2025 · Pattern Recognit. 2025 · CC BY-SA 4.0
Abstract (Highlights):
• We find that vanilla MWA performs well for class-imbalanced tasks and that early-epoch averaging yields greater gains, inspiring the design of IMWA.
• IMWA iteratively conducts parallel training and weight averaging, and its integration with EMA shows their complementary benefits.
• Extensive experiments demonstrate that IMWA outperforms vanilla MWA and effectively boosts performance for class-imbalanced learning.
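To make the second highlight concrete, below is a minimal, illustrative Python/PyTorch sketch of the iterate-then-average loop it describes: clone the current averaged weights into several models, train them in parallel episodes, then average their weights before the next iteration. The helpers `make_model` and `train_one_episode`, and the iteration/model counts, are hypothetical placeholders, not the authors' implementation or hyperparameters.

```python
import copy
import torch

def average_state_dicts(state_dicts):
    """Element-wise average of model weights (a vanilla MWA step)."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        stacked = torch.stack([sd[key].float() for sd in state_dicts], dim=0)
        avg[key] = stacked.mean(dim=0).to(state_dicts[0][key].dtype)
    return avg

def imwa(make_model, train_one_episode, num_iterations=3, num_parallel=4):
    """Illustrative IMWA loop (assumed structure): each iteration starts
    several models from the current averaged weights, trains them, and
    averages the resulting weights to seed the next iteration."""
    averaged = make_model().state_dict()
    for it in range(num_iterations):
        trained = []
        for k in range(num_parallel):
            model = make_model()
            model.load_state_dict(averaged)
            # hypothetical per-model training episode (could run in parallel)
            train_one_episode(model, seed=it * num_parallel + k)
            trained.append(model.state_dict())
        averaged = average_state_dicts(trained)
    final = make_model()
    final.load_state_dict(averaged)
    return final
```

In this reading, an EMA of the weights could additionally be maintained inside `train_one_episode`, which is one way the complementarity mentioned in the highlights might be realized; the paper itself should be consulted for the exact procedure.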