Keywords: CNN, Forward-Forward Algorithm, Back-propagation, Chest X-ray, Pneumonia
TL;DR: We present a multi-stage (local, then global) forward-forward contrastive pretraining strategy that improves the performance of state-of-the-art models on medical image classification.
Abstract: Medical image classification is one of the most important tasks in computer-aided diagnosis. Deep learning models, particularly convolutional neural networks, have been used successfully for disease classification from medical images, facilitated by automated feature learning. However, diverse imaging modalities and clinical pathologies make it challenging to construct generalized and robust classifiers. To improve model performance, we propose a novel pretraining approach, namely \textbf{Forward-Forward Contrastive Learning (FFCL)}, which leverages the Forward-Forward Algorithm in a contrastive learning framework, both locally and globally. Our experimental results on a chest X-ray dataset indicate that the proposed FFCL achieves superior performance (a \textbf{3.69\%} accuracy gain over an ImageNet-pretrained ResNet-18) compared to existing pretraining models on the pneumonia classification task. Moreover, extensive ablation experiments support the specific local and global contrastive pretraining design of FFCL.
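To make the local pretraining idea concrete, below is a minimal sketch of a single layer trained with the Forward-Forward Algorithm's "goodness" objective on contrastive positive/negative inputs. This is an illustrative assumption of how per-layer FF training can be wired, not the authors' exact FFCL recipe: the `FFLayer` class, the goodness threshold, the optimizer settings, and the pairing scheme are all hypothetical choices for the sketch.

```python
# Hypothetical sketch: one locally trained layer in the Forward-Forward
# style. The layer maximizes "goodness" (mean squared activation) for
# positive inputs and minimizes it for negative inputs; no gradients
# flow between layers, so each layer learns with forward passes only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FFLayer(nn.Module):
    def __init__(self, in_dim, out_dim, threshold=2.0, lr=1e-3):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold  # assumed goodness margin, not from the paper
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so goodness from the previous layer cannot leak through.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return F.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        # Push positive goodness above the threshold and negative below it.
        loss = F.softplus(
            torch.cat([self.threshold - g_pos, g_neg - self.threshold])
        ).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach outputs so the next layer trains locally, without
        # backpropagating through this one.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()


# Usage sketch: stack layers and pretrain them one after another.
# In a contrastive setup, x_pos might be features of matched image pairs
# (e.g., two views of the same chest X-ray) and x_neg features of
# mismatched pairs; this pairing scheme is an assumption for illustration.
layers = [FFLayer(512, 256), FFLayer(256, 128)]
x_pos = torch.randn(32, 512)  # placeholder positive-pair features
x_neg = torch.randn(32, 512)  # placeholder negative-pair features
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)
```

In this sketch the local stage needs no end-to-end backpropagation at all; a subsequent global contrastive stage, as the abstract describes, would then refine the full network jointly before fine-tuning on the downstream pneumonia classification task.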