Adversarial Training in Continuous-Time Models and Irregularly Sampled Time-Series

Published: 20 Jun 2023, Last Modified: 07 Aug 2023, AdvML-Frontiers 2023
Keywords: Adversarial training, continuous-time models, irregularly sampled time series
TL;DR: First paper to apply adversarial training methods to continuous-time models and irregularly sampled time series
Abstract: This study presents a first exploration of the effects of adversarial training on continuous-time models and irregularly sampled time-series data. Historically, these models and sampling regimes have been largely neglected in adversarial learning research, leaving a significant gap in our understanding of their performance under adversarial conditions. To address this, we conducted an empirical study of adversarial training techniques applied to continuous-time model architectures and sampling methods. Our findings suggest that while standard continuous-time models tend to outperform their discrete counterparts (especially on irregularly sampled datasets), this performance advantage diminishes almost entirely when adversarial training is employed. This indicates that adversarial training may interfere with the time-continuous representation, effectively neutralizing the benefits typically associated with these models. We believe these insights will be critical in guiding further advancements in adversarial learning research for continuous-time models.
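
To make the setting concrete, the sketch below shows one way adversarial training can be combined with a continuous-time classifier on irregularly sampled data. This is not the authors' code: the ODE-RNN-style encoder, the fixed-step Euler solver, the PGD hyperparameters, and the synthetic data are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): PGD adversarial training of an
# ODE-RNN-like classifier on irregularly sampled time series.
import torch
import torch.nn as nn


class ODEFunc(nn.Module):
    """Parameterizes the hidden-state dynamics dh/dt = f(h)."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
                                 nn.Linear(hidden_dim, hidden_dim))

    def forward(self, h):
        return self.net(h)


class ODERNNClassifier(nn.Module):
    """Evolves the hidden state continuously between observations (Euler steps)
    and updates it with a GRU cell at each observation time."""
    def __init__(self, input_dim, hidden_dim, num_classes, euler_steps=5):
        super().__init__()
        self.odefunc = ODEFunc(hidden_dim)
        self.cell = nn.GRUCell(input_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)
        self.euler_steps = euler_steps
        self.hidden_dim = hidden_dim

    def forward(self, x, t):
        # x: (batch, seq_len, input_dim); t: (batch, seq_len) irregular timestamps
        h = x.new_zeros(x.size(0), self.hidden_dim)
        prev_t = torch.zeros_like(t[:, 0])
        for i in range(x.size(1)):
            dt = (t[:, i] - prev_t).unsqueeze(-1) / self.euler_steps
            for _ in range(self.euler_steps):   # continuous evolution between observations
                h = h + dt * self.odefunc(h)
            h = self.cell(x[:, i], h)           # discrete update at each observation
            prev_t = t[:, i]
        return self.head(h)


def pgd_attack(model, x, t, y, eps=0.1, alpha=0.02, steps=7):
    """Projected gradient descent on observed values; timestamps are left clean."""
    delta = torch.zeros_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(steps):
        loss = nn.functional.cross_entropy(model(x + delta, t), y)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps).detach().requires_grad_(True)
    return (x + delta).detach()


# Illustrative adversarial-training loop on synthetic irregularly sampled data.
model = ODERNNClassifier(input_dim=3, hidden_dim=32, num_classes=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(10):
    x = torch.randn(16, 20, 3)
    t = torch.sort(torch.rand(16, 20), dim=1).values  # irregular, increasing times
    y = torch.randint(0, 2, (16,))
    x_adv = pgd_attack(model, x, t, y)                # generate adversarial minibatch
    loss = nn.functional.cross_entropy(model(x_adv, t), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch only the observed values are perturbed while the irregular timestamps stay clean, which is one natural threat model for this setting; perturbing the sampling times themselves would be a different, equally plausible choice.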
Submission Number: 92