Abstract: Designing effective neural networks from scratch for various time-series analysis tasks, such as activity recognition, fault detection, and traffic forecasting, is time-consuming and heavily relies on human labor. To reduce this reliance, recent studies adopt neural architecture search (NAS) to design neural networks for time series automatically. However, existing NAS frameworks for time series focus on a single analysis task, such as forecasting or classification, and rely on expensive search methods. This paper therefore aims to build a unified zero-shot NAS framework that effectively searches neural architectures for a variety of tasks and time-series data. To build a general framework for different tasks, we need a zero-shot proxy that consistently correlates with downstream performance across time-series datasets with different characteristics. To address this challenge, we propose a zero-shot NAS framework with novel Time-Frequency Aware Scoring for general time-series analysis, named TFAS. To incorporate TFAS into existing foundation time-series models, we adopt time-frequency decomposition methods and introduce the concept of augmented architecture to the foundation models. This augmented architecture enables the zero-shot proxy to be aware of the decomposed time and frequency information, resulting in a more accurate estimation of downstream performance on a given dataset. Empirically, we show that the architectures found by TFAS achieve improvements of up to 23.6% over state-of-the-art hand-crafted baselines on five mainstream time-series data mining tasks: short- and long-term forecasting, classification, anomaly detection, and imputation.
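To make the idea of scoring candidates on decomposed time and frequency views concrete, here is a minimal, illustrative sketch. It is not the authors' implementation: the decomposition (moving-average trend plus FFT magnitude spectrum), the two-branch "augmented" candidate, and the gradient-norm zero-shot proxy are all assumed stand-ins for whatever TFAS actually uses.

```python
import torch
import torch.nn as nn


def time_frequency_views(x: torch.Tensor, kernel: int = 25):
    """Split a batch of series (B, L) into a smoothed time view and a frequency view."""
    trend = torch.nn.functional.avg_pool1d(
        x.unsqueeze(1), kernel_size=kernel, stride=1, padding=kernel // 2
    ).squeeze(1)
    freq = torch.fft.rfft(x, dim=-1).abs()  # magnitude spectrum as the frequency view
    return trend, freq


def zero_shot_score(model: nn.Module, x: torch.Tensor) -> float:
    """Gradient-norm proxy at initialization: no training, just one backward pass."""
    trend, freq = time_frequency_views(x)
    model(trend, freq).sum().backward()
    return sum(p.grad.abs().sum().item() for p in model.parameters() if p.grad is not None)


class Candidate(nn.Module):
    """Toy 'augmented' candidate with separate branches for each decomposed view."""

    def __init__(self, length: int, hidden: int = 64):
        super().__init__()
        self.time_branch = nn.Sequential(nn.Linear(length, hidden), nn.ReLU())
        self.freq_branch = nn.Sequential(nn.Linear(length // 2 + 1, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, trend, freq):
        return self.head(torch.cat([self.time_branch(trend), self.freq_branch(freq)], dim=-1))


if __name__ == "__main__":
    x = torch.randn(32, 96)  # batch of 32 series of length 96
    scores = {h: zero_shot_score(Candidate(96, hidden=h), x) for h in (32, 64, 128)}
    print(scores)  # rank candidate architectures without any training
```

In a zero-shot NAS loop, such a score would be computed for every candidate in the search space and the highest-scoring architecture would then be trained on the target dataset; the point of a time-frequency-aware proxy is that the ranking it induces should track downstream performance across tasks and datasets.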
External IDs: dblp:journals/ml/TriratL25