Relational Multi-Instance Learning for Concept Annotation from Medical Time Series
Sanjay Purushotham, Zhengping Che, Bo Jiang, Tanachat Nilanon, Yan Liu
Feb 15, 2018 (modified: Feb 15, 2018) — ICLR 2018 Conference Blind Submission
Abstract: Recent advances in computing technology and sensor design have made it easier to collect longitudinal or time series data from patients, resulting in a vast amount of available medical data. Most medical time series lack annotations, and even when annotations are available, they can be subjective and prone to human error. Earlier works have developed natural language processing techniques to extract concept annotations and/or clinical narratives from doctor notes. However, these approaches are slow and do not use the accompanying medical time series data. To address this issue, we introduce the problem of concept annotation for medical time series data, i.e., the task of predicting and localizing medical concepts using the time series data as input. We propose Relational Multi-Instance Learning (RMIL), a deep multi-instance learning framework based on recurrent neural networks, which uses pooling functions and attention mechanisms for the concept annotation tasks. Empirical results on medical datasets show that our proposed models outperform various multi-instance learning baselines.
TL;DR: We propose a deep multi-instance learning framework based on recurrent neural networks that uses pooling functions and attention mechanisms for concept annotation tasks.
Keywords: Multi-instance learning, Medical Time Series, Concept Annotation
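To make the aggregation step in the abstract concrete: in a multi-instance view of a time series, each timestep's RNN hidden state is an instance, and a pooling function (e.g. max) or an attention mechanism combines instance-level information into a bag-level (patient-level) prediction, with the attention weights offering a way to localize which timesteps drive a concept. The sketch below is an illustrative NumPy formulation under our own assumptions (function names, the single attention vector `w`, and the toy shapes are not from the paper, whose implementation details are not given here):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden, w):
    # hidden: (T, d) per-timestep RNN hidden states (the "instances" of the bag)
    # w: (d,) attention parameter vector (learned in practice; fixed here for illustration)
    scores = hidden @ w          # (T,) unnormalized attention score per timestep
    alpha = softmax(scores)      # (T,) attention weights, sum to 1
    bag = alpha @ hidden         # (d,) weighted bag-level representation
    return bag, alpha            # alpha can be inspected to localize concepts in time

def max_pool(instance_probs):
    # standard MIL assumption: a bag is positive if any instance is positive
    return instance_probs.max()

# Toy usage: 5 timesteps, 4-dimensional hidden states
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
w = rng.normal(size=4)
bag, alpha = attention_pool(h, w)
```

The attention weights `alpha` sum to one, so the bag representation stays on the same scale as the hidden states; max pooling instead reduces instance-level concept probabilities directly to a bag-level probability.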