Influence-Aware Attention for Multivariate Temporal Point Processes

Published: 17 Mar 2023, Last Modified: 22 May 2023, CLeaR 2023 Poster
Keywords: Graphical Event Model, Multivariate Temporal Point Process, Variational Inference, Transformer, Attention Mechanism
TL;DR: We propose a neural model, Influence-Aware Attention for Multivariate Temporal Point Processes, which leverages the powerful attention mechanism in transformers to capture temporal dynamics between event types using variational inference.
Abstract: Identifying the subset of event types that influence events of interest in continuous-time datasets is important in a variety of applications. Existing methods, however, often fail to produce accurate and interpretable results in a time-efficient manner. In this paper, we propose a neural model – Influence-Aware Attention for Multivariate Temporal Point Processes (IAA-MTPP) – which leverages the powerful attention mechanism in transformers to capture temporal dynamics between event types, rather than between individual event instances as existing instance-to-instance attentions do, using variational inference while maintaining interpretability. Given event sequences and a prior influence matrix, IAA-MTPP efficiently learns an approximate posterior through an Attention-to-Influence mechanism, and subsequently models the conditional likelihood of the sequences given a sampled influence matrix through an Influence-to-Attention formulation. Both steps are carried out efficiently inside a B-block multi-head self-attention layer, so end-to-end training with the parallelizable transformer architecture is faster than with sequential models such as RNNs. We demonstrate strong empirical performance against existing baselines on multiple synthetic and real benchmarks, including a qualitative analysis of an application in decentralized finance.
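The Attention-to-Influence idea sketched in the abstract – turning pairwise attention scores between event types into a sampled binary influence matrix – can be illustrated with a toy example. This is a minimal sketch under assumptions, not the authors' implementation: the function name `attention_to_influence`, the type embeddings, and the Gumbel-sigmoid relaxation used for reparameterized sampling are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_to_influence(type_emb, temperature=0.5):
    """Toy Attention-to-Influence step (hypothetical): derive
    type-to-type attention logits from event-type embeddings,
    read them as an approximate posterior over a binary influence
    matrix, and draw a relaxed (Gumbel-sigmoid) sample of it."""
    d = type_emb.shape[1]
    # scaled dot-product attention logits between event types, shape (K, K)
    scores = type_emb @ type_emb.T / np.sqrt(d)
    # approximate posterior probability that type j influences type i
    probs = 1.0 / (1.0 + np.exp(-scores))
    # reparameterized relaxed sample of the influence matrix
    u = rng.uniform(1e-6, 1 - 1e-6, size=probs.shape)
    logistic_noise = np.log(u) - np.log1p(-u)
    z = 1.0 / (1.0 + np.exp(-(scores + logistic_noise) / temperature))
    return probs, z

K, d = 4, 8  # K event types, embedding dimension d (toy sizes)
emb = rng.normal(size=(K, d))
probs, z = attention_to_influence(emb)
```

In this sketch the relaxed sample `z` plays the role of the "sampled influence" that conditions the likelihood model; lowering `temperature` pushes the entries toward hard 0/1 values.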