Preventing Conflicting Gradients in Neural Temporal Point Process Models for Irregular Time Series Data

Published: 10 Oct 2024 · Last Modified: 26 Nov 2024 · NeurIPS 2024 TSALM Workshop · CC BY 4.0
Keywords: Irregular Time Series, Neural Temporal Point Processes, Conflicting Gradients
TL;DR: We propose novel parametrizations of neural MTPP models that prevent the emergence of conflicting gradients when trained on irregular time series data.
Abstract: Neural Marked Temporal Point Processes (MTPPs) are flexible models that are typically trained on large collections of sequences of irregularly-spaced labeled events. These models inherently learn two predictive distributions: one for the arrival times of events and another for the types of events, also known as marks. In this study, we demonstrate that learning an MTPP model can be framed as a two-task learning problem, where both tasks share a common set of trainable parameters that are optimized jointly. We show that this practice can lead to conflicting gradients during training, resulting in degraded performance on both tasks. To overcome this issue, we introduce novel parametrizations for neural MTPP models that allow for separate modeling and training of each task, effectively avoiding the problem of conflicting gradients.
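The gradient conflict described in the abstract can be illustrated with a toy example. The sketch below is not the paper's method or model; it only demonstrates the standard diagnostic for conflict between two tasks sharing parameters: a negative cosine similarity between the per-task gradients. The parameter vector and the two quadratic losses are hypothetical, chosen purely to produce conflicting gradients.

```python
import math

# Toy shared parameters for a two-task model (illustrative only; the
# paper's actual MTPP parametrizations are not reproduced here).
w = [1.0, 2.0]

# Hypothetical per-task losses sharing w:
#   time task:  L_t(w) = (w0 - 3)^2 + w1^2
#   mark task:  L_m(w) = (w0 + 1)^2 + (w1 - 2)^2
grad_time = [2 * (w[0] - 3), 2 * w[1]]        # analytic gradient of L_t
grad_mark = [2 * (w[0] + 1), 2 * (w[1] - 2)]  # analytic gradient of L_m

def cosine(u, v):
    """Cosine similarity between two gradient vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Gradients "conflict" when their cosine similarity is negative:
# a joint descent step then increases at least one task's loss.
sim = cosine(grad_time, grad_mark)
print(f"cosine similarity: {sim:.3f}",
      "-> conflicting" if sim < 0 else "-> aligned")
```

Here the two gradients point in partially opposite directions, so the shared update cannot serve both tasks at once; the paper's parametrizations avoid this by giving each task its own trainable parameters.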
Submission Number: 23