Abstract: Semi-supervised learning from limited quantities of labeled data, an alternative to fully supervised schemes, benefits from maximizing the knowledge gained from copious unlabeled data. Furthermore, learning multiple tasks within the same model improves generalizability. We propose MultiMix, a novel multitask learning model that jointly learns disease classification and anatomical segmentation in a sparingly supervised manner, while preserving explainability through a saliency bridge between the two tasks. Extensive experiments with varied quantities of labeled training data affirm the effectiveness of our multitasking model in classifying pneumonia and segmenting the lungs in chest X-ray images. Moreover, both in-domain and cross-domain evaluations across the tasks further showcase the model's ability to adapt to challenging generalization scenarios.
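The joint training objective for such a multitask model can be sketched as a weighted sum of a classification loss and a segmentation loss. The sketch below is illustrative only: the Dice formulation and the task weights `w_cls` and `w_seg` are assumptions for demonstration, not the paper's exact losses.

```python
import numpy as np

def cross_entropy(probs, label):
    # Classification loss for one image: negative log-likelihood of the true class.
    return -np.log(probs[label])

def dice_loss(pred_mask, true_mask, eps=1e-6):
    # Segmentation loss: 1 - Dice overlap between predicted and reference masks.
    inter = np.sum(pred_mask * true_mask)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred_mask) + np.sum(true_mask) + eps)

def multitask_loss(probs, label, pred_mask, true_mask, w_cls=1.0, w_seg=1.0):
    # Joint objective: weighted sum of the two task losses.
    # The weights are illustrative hyperparameters, not values from the paper.
    return w_cls * cross_entropy(probs, label) + w_seg * dice_loss(pred_mask, true_mask)

# Toy example: confident correct classification and a perfect mask give a small loss.
probs = np.array([0.1, 0.9])          # predicted class probabilities
mask = np.ones((4, 4))                # predicted (soft) lung mask
loss = multitask_loss(probs, 1, mask, mask)
```

In a sparingly supervised setting, the segmentation term would only be computed on the subset of images that carry mask annotations, while the classification term uses all labeled images.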