Unified Transformer Framework for Active Adaptation to Concept Drift in Time Series Forecasting

28 Apr 2026 (modified: 28 Apr 2026), THU 2026 Spring ANM Submission, CC BY 4.0
Keywords: Time series forecasting, test-time adaptation, concept drift detection, time series clustering
TL;DR: Current time series models fail under concept drift. This paper presents a transformer-based framework that detects drift early by monitoring attention dynamics, then adapts using LoRA—no delayed labels or heavy memory required.
Abstract: Modern time series forecasting models struggle with real-world non-stationarity, particularly **concept drift**—where the underlying relationship between historical data and future values changes over time. Existing test-time adaptation (TTA) methods are computationally expensive, rely on ground-truth labels that are unavailable (or delayed) at test time, or suffer from catastrophic forgetting. This proposal introduces a **unified Transformer-based framework** that couples concept drift detection directly with an adaptation mechanism within a single architecture. The key novelty lies in leveraging **attention dynamics** as an early warning system for drift, combined with **Low-Rank Adaptation (LoRA)** for parameter-efficient, real-time model updates without catastrophic forgetting. The framework will be evaluated on drift-specific datasets (USC-HAD, CaDrift) and four long-horizon forecasting benchmarks (ETT, Weather, Traffic, Exchange), comparing detection accuracy, adaptation latency, and forecasting error (MAE/MSE) against state-of-the-art methods including CEP, Proceed, TAFAS, and LEAF. The project aims to deliver a closed-loop system that maintains forecasting accuracy across non-stationary data regimes.
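To make the proposed closed loop concrete, the sketch below illustrates one plausible instantiation of the two components the abstract names: a drift monitor that tracks a summary statistic of attention dynamics (here, mean attention-row entropy) and flags large deviations from its running baseline, and a LoRA-style low-rank weight delta for cheap adaptation. All names, thresholds, and the choice of entropy as the monitored statistic are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def attention_entropy(attn):
    """Mean Shannon entropy of the attention rows.

    `attn` is a (queries x keys) matrix whose rows sum to 1; a sudden
    change in this scalar is one cheap proxy for shifting attention dynamics.
    """
    p = np.clip(attn, 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=-1)))

class DriftMonitor:
    """Flags drift when the monitored statistic leaves a k-sigma band
    around its running mean (simple online z-score detector)."""

    def __init__(self, k=3.0, warmup=10):
        self.k = k          # sensitivity: how many sigmas count as drift
        self.warmup = warmup  # observations collected before alarms fire
        self.history = []

    def update(self, stat):
        drifted = False
        if len(self.history) >= self.warmup:
            mu = np.mean(self.history)
            sigma = np.std(self.history) + 1e-8  # avoid divide-by-zero
            drifted = abs(stat - mu) > self.k * sigma
        self.history.append(stat)
        return drifted

def lora_delta(A, B, alpha=1.0, rank=4):
    """LoRA-style update: delta_W = (alpha / rank) * B @ A.

    A has shape (rank, d_in), B has shape (d_out, rank); only these two
    small matrices are trained, so the frozen base weights W are untouched
    and the update is merged as W + lora_delta(A, B).
    """
    return (alpha / rank) * (B @ A)

if __name__ == "__main__":
    mon = DriftMonitor(k=3.0, warmup=10)
    uniform = np.full((4, 8), 1 / 8)          # stable, diffuse attention
    peaked = np.eye(4, 8)                      # abruptly peaked attention
    for _ in range(10):
        mon.update(attention_entropy(uniform))
    print(mon.update(attention_entropy(peaked)))  # drift detected on shift
```

The detector would trigger the adaptation step: when `update` returns `True`, the (hypothetical) training loop fits only `A` and `B` on recent windows and merges `lora_delta` into the frozen backbone, which is what keeps the update parameter-efficient and forgetting-free.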
Submission Number: 9