Keywords: generalized zero-shot learning, knowledge graphs, graph neural networks
TL;DR: We propose a method that incorporates information from the structure of a knowledge graph for multi-label chest X-ray classification in a generalized zero-shot learning setting.
Abstract: Annotated datasets are crucial to the performance of modern machine learning algorithms. However, obtaining richly annotated datasets is not always possible, especially for novel or rare diseases. This is particularly challenging in multi-label classification of chest radiographs, owing to the numerous unknown disease types and the limited information inherent in X-ray images. Ideally, we would like models that can reliably label such unseen patterns (classes). In this work, we present a knowledge graph-based approach to predicting such novel, unseen classes. Our method directly injects the semantic relationships between seen and unseen disease classes into the model. Specifically, we propose a principled approach to parsing and processing a knowledge graph conditioned on the given task. We show that our method matches the labeling performance of the state of the art while outperforming it on unseen classes by a substantial 2% gain on chest X-ray classification. Crucially, we demonstrate that embedding disease-specific knowledge as a graph provides inherent explainability.