Alleviating Catastrophic Interference in Online Learning via Varying Scale of Backward Queried Data

Published: 01 Jan 2021, Last Modified: 16 May 2025. ICONIP (3) 2021. License: CC BY-SA 4.0
Abstract: In recent years, connectionist networks have become a staple in real-world systems due to their ability to generalize and to find intricate relationships and patterns in data. One inherent limitation of connectionist networks, however, is catastrophic interference: an inclination to lose previously formed knowledge when training on new data. This hindrance is especially evident in online machine learning, where data is fed sequentially into the network. Previous methods, such as rehearsal and pseudo-rehearsal systems, have attempted to alleviate catastrophic interference by introducing past data or replicated data into the data stream. While these methods have proven effective, they add complexity to the model and require storing previous data.
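The rehearsal approach the abstract describes, interleaving stored past samples into the incoming data stream, can be sketched as follows. This is a minimal illustration, not the paper's backward-querying method; the class name `RehearsalBuffer` and its parameters are hypothetical.

```python
import random

class RehearsalBuffer:
    """Minimal rehearsal sketch: keep a bounded store of past samples
    and mix them back into the stream during online training."""

    def __init__(self, capacity, mix_size):
        self.capacity = capacity  # max stored past samples
        self.mix_size = mix_size  # past samples replayed per update
        self.buffer = []
        self.seen = 0             # total samples observed so far

    def add(self, sample):
        # Reservoir sampling keeps a uniform random subset of the stream.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(sample)
        else:
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.buffer[idx] = sample

    def training_batch(self, new_sample):
        # Interleave the new sample with replayed past samples so the
        # network is never updated on the newest data alone.
        replay = random.sample(self.buffer,
                               min(self.mix_size, len(self.buffer)))
        batch = [new_sample] + replay
        self.add(new_sample)
        return batch
```

Each online update then trains on `training_batch(x)` instead of `x` alone, which is precisely what makes plain rehearsal costly: the buffer must retain genuine past data, the limitation the abstract notes.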