Keywords: Pre-trained language model, Prompt engineering, Zero-shot learning, Time series
TL;DR: A pre-trained language model with prompt engineering is constructed for zero-shot time series classification.
Abstract: This study constructs the LanguAge Model with Prompt EngineeRing (LAMPER) framework, designed to systematically evaluate the adaptability of pre-trained language models (PLMs) in accommodating diverse prompts and their integration in zero-shot time series (TS) classification. We deploy LAMPER in experimental assessments using 128 univariate TS datasets sourced from the UCR archive. Our findings indicate that the feature representation capacity of LAMPER is influenced by the maximum input token threshold imposed by PLMs.
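Below is a minimal sketch of the idea the abstract describes: serializing a univariate time series into a text prompt and embedding it with a PLM, where the tokenizer's maximum-input-token truncation is the bottleneck the findings point to. This is not the authors' released code; the prompt template, model choice (`bert-base-uncased`), 512-token limit, and mean pooling are illustrative assumptions.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical PLM backbone; the paper may use a different model.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_series(values, prompt="The following is a time series:"):
    # Serialize numeric values as text; long series exceed the model's
    # 512-token limit and are truncated, discarding part of the signal.
    text = prompt + " " + ", ".join(f"{v:.2f}" for v in values)
    inputs = tokenizer(text, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    # Mean pooling to a fixed-size feature vector (an assumption here).
    return hidden.mean(dim=1).squeeze(0)

emb = embed_series([0.1 * i for i in range(1000)])  # longer than fits
print(emb.shape)  # torch.Size([768])
```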
Submission Number: 143