LAST SToP for Modeling Asynchronous Time Series

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: Using Large Language Models (LLMs) to model Asynchronous Time Series
Abstract: We present a novel prompt design for Large Language Models (LLMs) tailored to **Asynchronous Time Series**. Unlike regular time series, which assume values at evenly spaced time points, asynchronous time series consist of timestamped events occurring at irregular intervals, each described in natural language. Our approach effectively utilizes the rich natural language of event descriptions, allowing LLMs to benefit from their broad world knowledge when reasoning across different domains and tasks. This lets us extend the scope of asynchronous time series analysis beyond forecasting to tasks such as anomaly detection and data imputation. We further introduce **Stochastic Soft Prompting**, a novel prompt-tuning mechanism that significantly improves model performance, outperforming existing finetuning methods such as QLoRA. Through extensive experiments on real-world datasets, we demonstrate that our approach achieves state-of-the-art performance across diverse tasks and datasets.
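To make the Stochastic Soft Prompting idea concrete, here is a minimal PyTorch sketch, not the authors' implementation: it assumes a bank of learnable prompt vectors is prepended to the frozen LLM's input embeddings, and that the stochastic element is resampling a random prefix length of that bank at each training step. The class name, initialization scale, and sampling scheme are all illustrative assumptions.

```python
import random
import torch
import torch.nn as nn

class StochasticSoftPrompt(nn.Module):
    """Hypothetical sketch of a stochastic soft prompt: a bank of learnable
    prompt embeddings whose effective length is resampled every training step."""

    def __init__(self, prompt_len: int, embed_dim: int):
        super().__init__()
        # Full bank of learnable soft-prompt vectors (small random init).
        self.prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) token embeddings of the
        # serialized asynchronous event sequence fed to a frozen LLM.
        if self.training:
            # Assumed stochastic step: keep only a random prefix of the
            # prompt, so the leading vectors learn coarse, reusable structure.
            k = random.randint(1, self.prompt.size(0))
        else:
            # Use the full prompt at inference time.
            k = self.prompt.size(0)
        prefix = self.prompt[:k].unsqueeze(0).expand(input_embeds.size(0), -1, -1)
        return torch.cat([prefix, input_embeds], dim=1)
```

Under this reading, the base LLM stays frozen and only `self.prompt` receives gradients, which is what makes the method a prompt-tuning alternative to weight-finetuning approaches like QLoRA.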
Lay Summary: Most AI systems analyze data that arrives at regular intervals, like daily stock prices or hourly temperature readings. But many real-world events happen unpredictably — like medical emergencies, social media posts, or equipment failures — and are described in natural language rather than just numbers. Traditional methods struggle with this "asynchronous time series" data because they can't handle irregular timing and rich text descriptions together. We developed LASTS, a new approach that uses Large Language Models to analyze these irregular event sequences. Instead of forcing events into rigid categories, our method preserves their natural language descriptions, allowing the AI to use its understanding of language and world knowledge. We also created "Stochastic Soft Prompting," a finetuning technique that helps LLMs adapt to our specific domain data much better than other widely used finetuning techniques. Our approach significantly outperforms existing methods across multiple real-world datasets. This makes sophisticated time series analysis more accessible and could improve applications in healthcare monitoring, financial analysis, and social media understanding, helping organizations better predict and respond to irregular but important events.
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Link To Code: https://github.com/BorealisAI/last-stop
Primary Area: Deep Learning->Sequential Models, Time series
Keywords: Large Language Models, Asynchronous Time Series, Time Series Modeling, Deep Learning
Submission Number: 8410