PF-Training: Parameter Freezing for Efficient On-Device Training of CNN-based Object Detectors in Low-Resource Environments
Abstract: Convolutional neural network (CNN) training demands substantially more computation and memory than inference, and there has been active research on lightweight approaches for performing CNN training on-device. However, on-device resources are limited, which makes training CNNs directly on the device particularly challenging. This study proposes a lightweight algorithm for CNN training in low-resource environments based on parameter freezing. The proposed method reduces the training load by using a batch size of one and lowers computational overhead through normalization freezing and a modified weight-optimization technique. Furthermore, we propose a simple weight-distribution-based algorithm for selecting which layers to freeze, enabling efficient training. Applied to Tiny-YOLOv3, the proposed method achieves a 52.10% reduction in computation, a 55.79% reduction in memory footprint, and a 21.95% improvement in accuracy compared with the fully trained, fine-tuned model.
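The abstract outlines three ingredients: freezing normalization layers, freezing a subset of layers chosen from their weight distribution, and fine-tuning the remainder with a batch size of one. The sketch below illustrates this idea in PyTorch; it is not the paper's implementation, and the selection criterion (the standard deviation of each convolutional layer's weights, with a hypothetical threshold) is an assumption made only for illustration.

```python
# Hypothetical sketch of the parameter-freezing idea described in the abstract.
# The weight-distribution criterion here (per-layer weight standard deviation)
# is an assumption; the paper's actual selection rule is not given in the abstract.
import torch
import torch.nn as nn


def freeze_for_on_device_training(model: nn.Module, std_threshold: float = 1e-2) -> None:
    """Freeze normalization layers and low-spread conv layers before fine-tuning."""
    for module in model.modules():
        # Normalization freezing: stop updating running statistics and affine parameters.
        if isinstance(module, nn.BatchNorm2d):
            module.eval()
            for p in module.parameters():
                p.requires_grad = False
        # Weight-distribution-based selection: freeze conv layers whose weights
        # have a narrow spread (assumed to contribute less during adaptation).
        elif isinstance(module, nn.Conv2d):
            if module.weight.std().item() < std_threshold:
                for p in module.parameters():
                    p.requires_grad = False


# Usage sketch: freeze, then fine-tune only the remaining parameters with batch size 1.
# model = build_tiny_yolov3()   # placeholder for a Tiny-YOLOv3 backbone
# freeze_for_on_device_training(model)
# trainable = [p for p in model.parameters() if p.requires_grad]
# optimizer = torch.optim.SGD(trainable, lr=1e-3)
```

Only the unfrozen parameters are passed to the optimizer, so both the gradient computation and the optimizer state shrink, which is where the reported computation and memory savings would come from.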