Abstract: We present a novel bio-inspired semi-supervised learning strategy for semantic segmentation architectures. It is based on the so-called Hebbian principle, "neurons that fire together wire together," which closely mimics synaptic adaptation in the brain and provides a promising, biologically plausible local learning rule for updating neural network weights without supervision. Our approach consists of two stages. In the first stage, we exploit the Hebbian principle for unsupervised weight updates of both convolutional and, for the first time, transpose-convolutional layers characterizing downsampling-upsampling semantic segmentation architectures. In the second stage, we fine-tune the model on a few labeled data samples. We assess our methodology through an experimental evaluation involving several collections of biomedical images, since this domain is of particular importance in computer vision and is especially affected by data scarcity. Preliminary results demonstrate the effectiveness of our proposed method compared with state-of-the-art (SOTA) approaches under various labeled training data regimes. The code to reproduce our experiments is available at: https://tinyurl.com/ycywfjc2.
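As a point of reference, the Hebbian principle mentioned above can be sketched as a simple local weight update for a single linear layer; this is a minimal illustration of the general rule (the paper itself applies Hebbian-style updates to convolutional and transpose-convolutional layers), and the function name and learning rate are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hebbian_update(W, x, lr=0.01):
    """Plain Hebbian update for a linear layer.

    W : weight matrix of shape (out_dim, in_dim)
    x : pre-synaptic input vector of shape (in_dim,)
    """
    y = W @ x                        # post-synaptic activations
    # "Neurons that fire together wire together": the weight change is
    # proportional to the product of pre- and post-synaptic activity.
    return W + lr * np.outer(y, x)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 8))
x = rng.normal(size=8)
W_new = hebbian_update(W, x)
print(W_new.shape)
```

Note that this update is purely local (it uses only each weight's own input and output activity), which is what makes it usable without labels in the unsupervised first stage.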