Being More Lightweight and Practical: Mini-sized Contrastive Learning Pre-trained Models for Fine-grained Traffic Tasks
Keywords: Fine-grained traffic prediction, Spatio-temporal modeling, Lightweight models
TL;DR: We propose a mini-sized pre-trained model for fine-grained traffic prediction, offering an efficient and accurate solution when computational resources and data are limited.
Abstract: Fine-grained traffic prediction is critical for mitigating congestion in key urban areas and for providing lane-change guidance to autonomous vehicles and navigation systems. However, task-specific models are inefficient, city-scale pre-trained models often overlook fine-grained requirements, and the demand for extensive computational resources hinders practical deployment. To address these issues, we developed MiniTraffic, a lightweight pre-training framework. It leverages abundant road-level data to compensate for lane-level data scarcity through a frequency-domain stability augmentation module, and captures road-lane correlations via contrastive clustering to construct small-scale graph structures, significantly reducing the model's parameter count. Fine-tuning with minimal target data then yields a unified and efficient solution for fine-grained traffic prediction. In multi-granularity traffic prediction tasks across six fine-grained datasets, MiniTraffic outperformed all existing baseline models. The MiniTraffic code, datasets, and pre-trained models are available at https://anonymous.4open.science/r/MiniTraffic-ICLR26/.
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 12155