Efficiently Adapting Traffic Pre-trained Models for Encrypted Traffic Classification

Published: 2024 · Last Modified: 17 Jan 2026 · CSCWD 2024 · CC BY-SA 4.0
Abstract: Classification of encrypted traffic is essential to the security of collaborative systems. Traffic pre-trained models (TPTMs) have shown promising results on this task. Existing methods that apply TPTMs to encrypted traffic classification typically fine-tune all parameters of the TPTM to adapt to each downstream task. However, this approach requires updating every parameter of the large-scale TPTM during training and storing an entire new model for each task, which consumes significant computational and storage resources. This paper proposes the Efficient Adapter of Traffic (EAT), which improves the parameter efficiency of TPTMs. Specifically, it freezes the original parameters of the TPTM and inserts a learnable module between each pair of TPTM layers. When adapting to any downstream task, only the learnable modules need to be trained and stored, which makes our method more efficient in both computation and storage. To validate the effectiveness of our approach, we conducted experiments on various encrypted traffic datasets. The results demonstrate that our method achieves performance comparable to full fine-tuning while tuning only 3.4% of the TPTM's parameters, significantly reducing computational and storage costs.
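The adapter scheme the abstract describes (frozen backbone, small learnable module with a residual connection inserted between layers) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the hidden dimension, bottleneck dimension, ReLU nonlinearity, and zero initialization of the up-projection are all illustrative choices.

```python
import numpy as np

def adapter(x, W_down, W_up):
    """Bottleneck adapter: down-project, nonlinearity, up-project, plus a
    residual connection so the frozen backbone's signal passes through."""
    h = np.maximum(x @ W_down, 0.0)  # ReLU after the down-projection (assumed)
    return x + h @ W_up              # residual keeps the frozen path intact

rng = np.random.default_rng(0)
d, r = 768, 32                       # hypothetical hidden and bottleneck dims
W_down = rng.standard_normal((d, r)) * 0.01
W_up = np.zeros((r, d))              # zero-init: adapter starts as an identity map

x = rng.standard_normal((4, d))      # a batch of 4 hidden states
y = adapter(x, W_down, W_up)

# With W_up zero-initialized, the adapter output equals its input at the
# start of training, so inserting it does not disturb the pre-trained model.
print(np.allclose(x, y))  # True
```

Only `W_down` and `W_up` (here 2·d·r values per adapter) would be trained and stored per task, while the backbone's d×d-scale weight matrices stay frozen and shared, which is the source of the storage savings the abstract reports.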