Extending Unsupervised Neural Image Compression With Supervised Multitask Learning

Published: 18 Apr 2020, Last Modified: 05 May 2023
Venue: MIDL 2020
Abstract: We focus on the problem of training convolutional neural networks on gigapixel histopathology images to predict image-level targets. For this purpose, we extend Neural Image Compression (NIC), an image compression framework that reduces the dimensionality of these images using an encoder network trained in an unsupervised manner. We propose to train this encoder using supervised multitask learning (MTL) instead. We applied the proposed MTL NIC to two histopathology datasets and three tasks. First, we obtained state-of-the-art results in the Tumor Proliferation Assessment Challenge of 2016 (TUPAC16). Second, we successfully classified histopathological growth patterns in images with colorectal liver metastasis (CLM). Third, we predicted patient risk of death by learning directly from overall survival in the same CLM data. Our experimental results suggest that the representations learned by the MTL objective are: (1) highly specific, due to the supervised training signal, and (2) transferable, since the same features perform well across different tasks. Additionally, we trained multiple encoders with different training objectives, e.g., unsupervised training and variants of MTL, and observed a positive correlation between the number of tasks in MTL and system performance on the TUPAC16 dataset.
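To make the first stage concrete, below is a minimal PyTorch sketch of the MTL encoder idea: a shared trunk maps a patch to a low-dimensional embedding (the code later used for compression), and one lightweight head per supervised task shapes that embedding during joint training. All names, layer sizes, and the task list here are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of an MTL patch encoder (not the authors' code).
import torch
import torch.nn as nn

class MTLEncoder(nn.Module):
    def __init__(self, embedding_dim=128, task_classes=(2, 4, 3)):
        super().__init__()
        # Shared trunk: maps an RGB patch to a low-dimensional embedding.
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, embedding_dim),
        )
        # One lightweight classification head per supervised task.
        self.heads = nn.ModuleList(
            nn.Linear(embedding_dim, n) for n in task_classes
        )

    def forward(self, x, task_id):
        z = self.trunk(x)              # shared embedding (the "compressed" code)
        return self.heads[task_id](z)  # task-specific prediction

# Joint training: alternate mini-batches across tasks and backpropagate each
# task loss through the shared trunk, so every task shapes the embedding.
model = MTLEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
for patches, labels, task_id in []:  # placeholder for a multi-task patch loader
    loss = criterion(model(patches, task_id), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```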
Paper Type: well-validated application
TL;DR: We extended the Neural Image Compression framework by training the encoder with supervised multitask learning, trained a CNN classifier on compressed gigapixel histopathology images, and achieved state-of-the-art results on a public benchmark (TUPAC16).
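The second stage can be pictured the same way. The hedged sketch below (again with assumed names and sizes, not the paper's exact setup) tiles a gigapixel image into patches, encodes each patch with the frozen trunk, arranges the embeddings into a spatial grid, and feeds that grid to a small CNN for the image-level prediction.

```python
# Hypothetical sketch of patch-wise compression and the image-level classifier.
import torch
import torch.nn as nn

@torch.no_grad()
def compress(image, trunk, patch=128, dim=128):
    """image: (3, H, W) tensor with H and W divisible by `patch`."""
    _, H, W = image.shape
    rows, cols = H // patch, W // patch
    tiles = image.unfold(1, patch, patch).unfold(2, patch, patch)
    tiles = tiles.permute(1, 2, 0, 3, 4).reshape(-1, 3, patch, patch)
    # In practice the tiles would be encoded in mini-batches to fit in memory.
    codes = trunk(tiles)                                     # (rows*cols, dim)
    return codes.reshape(rows, cols, dim).permute(2, 0, 1)   # (dim, rows, cols)

# Small CNN operating on the compressed grid instead of raw pixels.
classifier = nn.Sequential(
    nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 2),  # e.g. a binary image-level target
)

# Usage, with `MTLEncoder` from the sketch above:
#   grid = compress(image, MTLEncoder().trunk)
#   logits = classifier(grid.unsqueeze(0))
```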
Track: full conference paper
Keywords: Neural image compression, supervised multitask learning, histopathology