Multi-Time Attention Networks for Irregularly Sampled Time Series

Published: 06 Jul 2020 · Last Modified: 29 Sept 2024 · ICML Artemiss 2020 · Readers: Everyone
TL;DR: This paper presents a new deep learning architecture for learning with sparse and irregularly sampled multivariate time series.
Keywords: irregular sampling, multivariate time series, attention, missing data
Abstract: Irregular sampling occurs in many time series modeling applications where it presents a significant challenge to standard deep learning models. This work is motivated by the analysis of physiological time series data in electronic health records, which are multivariate, sparse, irregularly sampled, and incompletely observed. In this paper, we propose a new deep learning framework for this setting that we call Multi-Time Attention Networks, which use embeddings and attention to produce fixed-dimensional representations of irregularly sampled multivariate time series. We evaluate this framework through applications to both interpolation and classification and show that it outperforms several recently proposed methods while offering significantly faster training times than current state-of-the-art approaches.
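The sketch below is a minimal, illustrative PyTorch rendering of the core idea described in the abstract (learnable time embeddings plus attention from a fixed set of reference time points over the irregularly observed time points), not the authors' implementation. All names (`TimeEmbedding`, `MultiTimeAttention`), the specific embedding form, and the masking details are assumptions for illustration; multiple attention heads and the downstream encoder/decoder are omitted.

```python
# Hypothetical sketch of time-embedding + attention interpolation for
# irregularly sampled multivariate series; not the paper's reference code.
import torch
import torch.nn as nn


class TimeEmbedding(nn.Module):
    """Embed scalar time stamps: one linear term plus learnable sinusoids."""

    def __init__(self, embed_dim: int):
        super().__init__()
        self.linear = nn.Linear(1, 1)                 # linear time term
        self.periodic = nn.Linear(1, embed_dim - 1)   # learnable frequencies/phases

    def forward(self, t: torch.Tensor) -> torch.Tensor:  # t: (batch, len)
        t = t.unsqueeze(-1)                               # (batch, len, 1)
        return torch.cat([self.linear(t), torch.sin(self.periodic(t))], dim=-1)


class MultiTimeAttention(nn.Module):
    """Attend from fixed reference time points to the observed time points."""

    def __init__(self, embed_dim: int, num_features: int):
        super().__init__()
        self.embed = TimeEmbedding(embed_dim)
        self.scale = embed_dim ** -0.5
        self.num_features = num_features

    def forward(self, ref_t, obs_t, values, mask):
        # ref_t:  (batch, n_ref)      reference (query) time points
        # obs_t:  (batch, n_obs)      observed time points
        # values: (batch, n_obs, D)   observed measurements, zero-filled
        # mask:   (batch, n_obs, D)   1 where a value was actually observed
        q = self.embed(ref_t)                                    # (batch, n_ref, E)
        k = self.embed(obs_t)                                    # (batch, n_obs, E)
        scores = torch.bmm(q, k.transpose(1, 2)) * self.scale    # (batch, n_ref, n_obs)
        scores = scores.unsqueeze(-1).expand(-1, -1, -1, self.num_features)
        # mask out unobserved entries per feature before the softmax
        scores = scores.masked_fill(mask.unsqueeze(1) == 0, float("-inf"))
        attn = torch.softmax(scores, dim=2)        # normalize over observed time points
        attn = torch.nan_to_num(attn)              # features with no observations at all
        # weighted sum of observed values -> fixed-size representation
        return torch.einsum("brod,bod->brd", attn, values)       # (batch, n_ref, D)
```

Because the reference time points are fixed, the output shape is independent of how many (and which) time points each series was observed at, so a standard classifier or decoder can consume the resulting `(batch, n_ref, D)` tensor directly.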
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/multi-time-attention-networks-for-irregularly/code)