Multiple independent losses scheduling: A simple training method for deep neural networks

2023 (modified: 25 Apr 2023) · Intell. Data Anal. 2023
Abstract: In recent years, various loss functions have been proposed to boost the performance of deep neural networks. Each loss function has its own theoretical motivation and, compared with other loss functions, more readily learns the features of the training data that it favors. Combining multiple loss functions to capture more data features is therefore an attractive way to improve model performance. In this paper, instead of using a single loss function or a linear weighted sum of multiple loss functions, we present Multiple Independent Losses Scheduling (MILS), a method that lets multiple loss functions participate independently in training according to their performance. Specifically, among all candidate loss functions, one is predefined as the primary loss function before training, while the others play auxiliary roles that may further improve the model. To keep auxiliary loss functions from harming the model during training, we develop a simple but effective performance-based scheduling algorithm that prevents them from dragging down model performance. Extensive experiments with various deep architectures on standard recognition benchmarks demonstrate that our scheme is simple, robust, lightweight, and effective for typical classification tasks.
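The abstract only sketches the scheduling idea at a high level, so the following is a minimal illustrative sketch rather than the authors' algorithm: it assumes a predefined primary loss that always trains the model, and a performance-based gate (here, held-out validation accuracy with rollback) that decides whether each auxiliary loss keeps participating. All function names and the specific gating rule are assumptions introduced for clarity.

```python
# Hypothetical sketch of performance-gated multi-loss training (not the paper's code).
import torch


def validation_accuracy(model, loader, device="cpu"):
    """Classification accuracy on a held-out loader (used as the performance signal)."""
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / max(total, 1)


def train_one_epoch(model, loader, loss_fn, optimizer, device="cpu"):
    """One epoch of standard supervised training driven by a single loss function."""
    model.train()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()


def mils_like_training(model, train_loader, val_loader, primary_loss,
                       aux_losses, epochs=30, lr=0.1, device="cpu"):
    """Alternate between the primary loss and the still-active auxiliary losses;
    an auxiliary loss is rolled back and retired if the epoch it drove lowers
    validation accuracy (an assumed gating rule, for illustration only)."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    active_aux = list(aux_losses)
    best_acc = validation_accuracy(model, val_loader, device)

    for _ in range(epochs):
        # The predefined primary loss always participates in training.
        train_one_epoch(model, train_loader, primary_loss, optimizer, device)
        best_acc = max(best_acc, validation_accuracy(model, val_loader, device))

        # Each active auxiliary loss trains independently and is kept only if
        # it does not drag validation accuracy below the best value seen so far.
        for aux in list(active_aux):
            snapshot = {k: v.detach().clone() for k, v in model.state_dict().items()}
            train_one_epoch(model, train_loader, aux, optimizer, device)
            acc = validation_accuracy(model, val_loader, device)
            if acc >= best_acc:
                best_acc = acc
            else:
                model.load_state_dict(snapshot)   # undo the harmful update
                active_aux.remove(aux)            # retire this auxiliary loss
    return model
```

Under these assumptions, the losses are never mixed into a single weighted objective; each one drives its own updates, and the gate simply decides whether an auxiliary loss is allowed to keep contributing.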