A Comparative Study of the Impact of Different First-Order Optimizers on the Learning Process of UNet for the Change Detection Task
Abstract: UNet is an encoder-decoder neural network that has been used to detect changes in remote-sensing images. This paper provides a comparative study of the performance of UNet when trained with different optimizers for the change detection task. Although several previous works compare different UNet models for change detection, this paper is, to the best of our knowledge, the first to investigate UNet with respect to the optimization method used in the learning process. This can help in designing more efficient UNet models, especially when training resources are limited. We compare five common gradient-based optimization techniques: Gradient Descent with Momentum (Momentum GD), Nesterov Accelerated Gradient (NAG), Adaptive Gradient (AdaGrad), Root Mean Square Propagation (RMSProp), and Adaptive Moment Estimation (Adam). For this purpose, UNet is trained for 200 epochs on the ONERA dataset to minimize the binary cross-entropy loss. The model is assessed using three metrics: accuracy, precision, and F1-score. According to the obtained results, RMSProp, NAG, and AdaGrad reached the highest validation accuracies, 0.976, 0.978, and 0.979, with learning rates of \(10^{-2}\), \(10^{-3}\), and \(10^{-4}\), respectively. Adam was the fastest to converge and achieved the lowest validation loss. Moreover, Adam scored the highest precision and F1-score across all learning rates, at 0.491 and 0.376, respectively. We also note that both momentum-based and adaptive algorithms perform better with relatively small learning rates.
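As a rough illustration of the comparison protocol described in the abstract, the sketch below sets up the five optimizers in PyTorch and trains a fresh model with each. It is a minimal sketch, not the authors' code: the `UNet` constructor and `train_loader` are hypothetical placeholders, and the learning rates are illustrative (the RMSProp, NAG, and AdaGrad values match the best-accuracy settings reported above).

```python
import torch
import torch.nn as nn

# Hypothetical placeholders: `UNet` and `train_loader` are assumed to be
# defined elsewhere; their names and signatures are illustrative only.

optimizers = {
    # Learning rates are illustrative; the NAG/AdaGrad/RMSProp values match
    # the best-accuracy settings reported in the abstract.
    "Momentum GD": lambda p: torch.optim.SGD(p, lr=1e-3, momentum=0.9),
    "NAG":         lambda p: torch.optim.SGD(p, lr=1e-3, momentum=0.9, nesterov=True),
    "AdaGrad":     lambda p: torch.optim.Adagrad(p, lr=1e-4),
    "RMSProp":     lambda p: torch.optim.RMSprop(p, lr=1e-2),
    "Adam":        lambda p: torch.optim.Adam(p, lr=1e-3),
}

criterion = nn.BCEWithLogitsLoss()  # binary cross-entropy on raw logits

for name, make_optimizer in optimizers.items():
    model = UNet()                        # fresh weights for every optimizer
    optimizer = make_optimizer(model.parameters())
    for epoch in range(200):              # 200 epochs, as in the study
        for images, masks in train_loader:
            optimizer.zero_grad()
            logits = model(images)        # per-pixel change logits
            loss = criterion(logits, masks)
            loss.backward()
            optimizer.step()
```

Re-initializing the model for every optimizer, as done here, keeps the comparison fair: each method starts its search from an equivalent point rather than inheriting weights shaped by a previous run.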