IMLP: An Energy-Efficient Continual Learning Method for Tabular Data Streams

19 Sept 2025 (modified: 12 Feb 2026) · ICLR 2026 Conference Desk Rejected Submission · CC BY 4.0
Keywords: Continual learning, energy efficiency, tabular data streams
Abstract: Tabular data streams are rapidly emerging as a dominant modality for real-time decision-making in healthcare, finance, and the Internet of Things (IoT). These applications commonly run on edge and mobile devices, where energy budgets, memory, and compute are strictly limited. Continual learning (CL) addresses such dynamics by training models sequentially on task streams while preserving prior knowledge and consolidating new knowledge. While recent CL work has made progress in mitigating catastrophic forgetting and improving knowledge transfer, the practical requirements of energy and memory efficiency for tabular data streams remain underexplored. In particular, most existing CL solutions depend on replay mechanisms whose buffers grow over time and exacerbate resource costs. We propose the \textit{context-aware incremental Multi-Layer Perceptron (IMLP)}, a compact continual learner for tabular data streams. IMLP applies windowed scaled dot-product attention over a sliding latent feature buffer, providing constant-size memory and avoiding the storage of raw data. The attended context is concatenated with the current features and processed by shared feed-forward layers, yielding lightweight per-segment updates. We evaluate IMLP against state-of-the-art (SOTA) tabular models on real-world tabular benchmarks designed to assess models under temporal distribution shifts (concept drift). Under \textit{incremental} concept drift, IMLP reduces total energy by 22.7\% relative to TabPFNv2 with only a 0.05 drop in final balanced accuracy. These results show that the proposed attention-based feature memory effectively curbs energy consumption while achieving the highest final accuracy under abrupt concept drift among all network baselines.
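The abstract's core mechanism (a constant-size latent buffer queried by windowed scaled dot-product attention, with the attended context concatenated to the current features) can be sketched as follows. This is a minimal NumPy illustration under assumed dimensions, not the authors' implementation; the feature dimension `d`, window size `window`, and the `step` helper are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # latent feature dimension (assumed for illustration)
window = 16    # sliding buffer capacity: memory stays constant-size

buffer = []    # stores latent feature vectors, never raw data rows

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step(z):
    """Process one latent feature vector z of shape (d,)."""
    if buffer:
        K = np.stack(buffer)                  # (n, d): keys = values here
        attn = softmax(z @ K.T / np.sqrt(d))  # scaled dot-product attention
        context = attn @ K                    # attended context, shape (d,)
    else:
        context = np.zeros(d)
    # Concatenate current features with attended context; in IMLP this
    # concatenation would feed the shared feed-forward layers.
    h = np.concatenate([z, context])          # shape (2d,)
    # Slide the window: append the new latent, evict the oldest at capacity.
    buffer.append(z)
    if len(buffer) > window:
        buffer.pop(0)
    return h

for _ in range(20):
    h = step(rng.normal(size=d))

print(h.shape, len(buffer))  # buffer never exceeds the window size
```

The key point of the sketch is that memory usage is bounded by `window * d` regardless of stream length, in contrast to replay buffers that grow over time.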
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 17137