Keywords: Test Time Adaptation, LLM, Temporal Point Processes
TL;DR: Efficient Test-Time Adaptation for Event Prediction
Abstract: Event prediction is central to applications from e-commerce to finance, yet real-world event streams are highly non-stationary, making deployment of frozen pre-trained models brittle. We propose Event prediction Test-Time Adaptation (ETTA), a lightweight, model-agnostic framework that enables event-prediction models to adapt on the fly during inference, without retraining the model itself. Unlike fine-tuning the model at test time, ETTA introduces compact temporal and logit calibration modules that calibrate the inter-event timestamps and the event-type logits, respectively. These adapters can be seamlessly applied to both neural temporal point process (TPP) models and large language models (LLMs). On five real-world benchmarks, ETTA consistently improves TPP models, achieving up to a 20.3% reduction in RMSE and 4.9% higher accuracy, while for LLM-based event prediction it yields up to a 23.1% RMSE reduction and a 9.0% accuracy improvement. Together, these results establish test-time adaptation, and ETTA in particular, as a powerful, general framework for robust event prediction under distribution shift.
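To make the idea of compact calibration adapters concrete, here is a minimal sketch of test-time calibration in the spirit described by the abstract. All class and parameter names are illustrative assumptions, not the paper's actual API: a frozen base model emits event-type logits and an inter-event time estimate, and two tiny affine adapters are updated online from the incoming stream while the base model stays fixed.

```python
import numpy as np

class LogitCalibrator:
    """Hypothetical per-class affine adjustment of frozen-model
    event-type logits, updated by online SGD on cross-entropy."""
    def __init__(self, num_types, lr=0.1):
        self.scale = np.ones(num_types)   # per-class multiplicative term
        self.bias = np.zeros(num_types)   # per-class additive term
        self.lr = lr

    def __call__(self, logits):
        return self.scale * logits + self.bias

    def update(self, logits, true_type):
        # One SGD step on cross-entropy w.r.t. scale and bias only;
        # the base model producing `logits` is never touched.
        z = self(logits)
        p = np.exp(z - z.max())
        p /= p.sum()
        grad_z = p.copy()
        grad_z[true_type] -= 1.0          # d(CE)/dz for softmax
        self.scale -= self.lr * grad_z * logits
        self.bias -= self.lr * grad_z


class TimeCalibrator:
    """Hypothetical affine correction of predicted inter-event
    times, applied in log-space so the output stays positive."""
    def __init__(self, lr=0.05):
        self.a, self.b = 1.0, 0.0
        self.lr = lr

    def __call__(self, dt_pred):
        return np.exp(self.a * np.log(dt_pred) + self.b)

    def update(self, dt_pred, dt_true):
        # One SGD step on squared log-time error.
        x = np.log(dt_pred)
        err = (self.a * x + self.b) - np.log(dt_true)
        self.a -= self.lr * err * x
        self.b -= self.lr * err
```

Because each adapter holds only a handful of scalars and updates with a single gradient step per observed event, the per-event cost is negligible next to a forward pass of a TPP model or an LLM, which is consistent with the "lightweight" framing above.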
Submission Number: 38