PatchCat: Rethinking Temporal Tokenization in Time Series Forecasting

11 Sept 2025 (modified: 20 Jan 2026) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Time series forecasting, temporal tokenizer
Abstract: Temporal tokenization is a fundamental component of time series forecasting, transforming raw signals into token representations. Existing temporal tokenizers fall into three typical categories, mapping time series into tokens at the point-wise, patch-wise, or variable-wise level. Through a fair comparison, we observe that none of these paradigms simultaneously achieves both high forecasting accuracy and computational efficiency. Motivated by the accuracy benefits of patch-wise tokenizers and the high efficiency of variable-wise tokenizers, we propose PatchCat, a competitive alternative. PatchCat segments the input time series into consecutive patches and concatenates their embeddings in chronological order. This workflow not only preserves local semantics and sequential information, but also compresses each univariate series into a single token, achieving efficiency comparable to variable-wise methods. To further enhance representational capacity, we adopt a linearly increasing dimension allocation strategy and variable-wise affine transformations. Experiments show that replacing the tokenizer in many existing methods with PatchCat consistently improves prediction performance. To further leverage PatchCat's strengths, we develop PCMLP, a simple yet powerful model based on a multilayer perceptron. Extensive experiments across 13 challenging real-world datasets demonstrate that our approach achieves competitive performance compared to state-of-the-art methods.
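The abstract describes the PatchCat workflow: segment a univariate series into consecutive patches, embed each patch with a linearly increasing output dimension, and concatenate the embeddings in chronological order into one token, followed by a variable-wise affine transformation. A minimal NumPy sketch of this pipeline is given below. It is an illustration of the described workflow only, not the authors' implementation: the function name `patchcat_tokenize`, the random linear projections, and all dimension choices are assumptions.

```python
import numpy as np

def patchcat_tokenize(series, num_patches=4, base_dim=8, scale=1.0, shift=0.0):
    """Illustrative PatchCat-style tokenization of one univariate series.

    Splits the series into consecutive patches, embeds each patch with a
    linear projection whose output dimension grows linearly with patch
    index (the "linearly increasing dimension allocation"), concatenates
    the embeddings chronologically into a single token, and applies a
    variable-wise affine transform (learnable in practice; fixed here).
    """
    rng = np.random.default_rng(0)            # stand-in for learned weights
    patch_len = len(series) // num_patches
    pieces = []
    for i in range(num_patches):
        patch = series[i * patch_len:(i + 1) * patch_len]
        out_dim = base_dim * (i + 1)          # later patches get more dimensions
        W = rng.standard_normal((out_dim, patch_len))
        pieces.append(W @ patch)              # patch-wise linear embedding
    token = np.concatenate(pieces)            # one token per univariate series
    return scale * token + shift              # variable-wise affine transform

# Example: a 96-step series compressed into a single 80-dim token
tok = patchcat_tokenize(np.sin(np.linspace(0.0, 6.28, 96)))
print(tok.shape)  # (80,)  since 8 + 16 + 24 + 32 = 80
```

With `num_patches=4` and `base_dim=8`, the four patch embeddings have dimensions 8, 16, 24, and 32, so the entire series collapses to one 80-dimensional token, which is what gives the method variable-wise-level efficiency while retaining patch-level locality.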
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 4008