T$^3$RD: Test-Time Training for Rumor Detection on Social Media

Published: 23 Jan 2024, Last Modified: 23 May 2024 · TheWebConf24 Oral
Keywords: fake news detection, test-time training, rumor classification, graph network
Abstract: With the increasing volume of news uploaded to the internet daily, rumor detection has garnered significant attention in recent years. Existing rumor detection approaches excel on familiar topics, since sufficient data (high resource) collected from the same domain is available for model training. However, they perform poorly at detecting rumors about emergent events, especially those propagated in different languages, due to the lack of training data and prior knowledge (low resource). To tackle this challenge, we introduce Test-Time Training for Rumor Detection (T$^3$RD) to enhance the performance of rumor detection models on low-resource datasets. Specifically, we introduce self-supervised learning (SSL) as an auxiliary task in test-time training (TTT). It consists of local and global contrastive learning (CL): the local CL focuses on acquiring invariant node representations, while the global CL focuses on obtaining invariant graph representations. We employ the auxiliary SSL task in both the training and test-time training phases to obtain a hint about the underlying traits of the test samples, and then leverage this hint to calibrate the trained model for those samples. To mitigate the risk of distribution distortion during test-time training, we introduce a feature alignment mechanism aimed at achieving a balanced synergy between the knowledge derived from the training set and that from the test samples. Experiments conducted on two widely used cross-domain datasets show that our proposed model achieves state-of-the-art performance. We also provide extensive ablation studies to verify the effectiveness of our method.
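As a rough illustration of the test-time training recipe the abstract describes, the sketch below adapts a copy of a trained model to each test batch by minimizing a contrastive (InfoNCE) auxiliary loss plus a feature-alignment penalty before predicting. This is a minimal sketch, not the authors' released code: the `Encoder`, the dropout-based "views" standing in for graph augmentations, the moment-matching alignment term, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of test-time training (TTT) with a contrastive auxiliary
# task and feature alignment, in PyTorch. Names and hyperparameters are
# assumptions for illustration, not the paper's actual implementation.
import copy
import torch
import torch.nn.functional as F
from torch import nn


class Encoder(nn.Module):
    """Toy stand-in for the graph encoder producing graph-level features."""

    def __init__(self, in_dim: int = 64, hid_dim: int = 128, n_classes: int = 2):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                      nn.Linear(hid_dim, hid_dim))
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        h = self.backbone(x)            # shared representation
        return h, self.classifier(h)    # features and rumor logits


def info_nce(z1, z2, temperature: float = 0.5):
    """InfoNCE loss between two augmented views (global contrastive term)."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                 # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


def ttt_adapt(model, x_test, train_mean, train_std, steps: int = 5, lr: float = 1e-3):
    """Adapt a copy of the trained model to a test batch via the SSL task,
    while keeping test features close to the training feature statistics."""
    adapted = copy.deepcopy(model)
    opt = torch.optim.Adam(adapted.parameters(), lr=lr)
    for _ in range(steps):
        # Two stochastic "views" of the test batch (feature dropout here is a
        # placeholder for the graph augmentations used in the paper).
        v1 = F.dropout(x_test, p=0.2, training=True)
        v2 = F.dropout(x_test, p=0.2, training=True)
        h1, _ = adapted(v1)
        h2, _ = adapted(v2)
        ssl_loss = info_nce(h1, h2)
        # Feature alignment: match first/second moments of the test features
        # to statistics recorded on the training set.
        h, _ = adapted(x_test)
        align_loss = F.mse_loss(h.mean(0), train_mean) + F.mse_loss(h.std(0), train_std)
        loss = ssl_loss + 0.1 * align_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        _, logits = adapted(x_test)
    return logits.argmax(dim=-1)
```

The key design point conveyed by the abstract is that the adapted copy never sees test labels: only the self-supervised signal and the alignment to training-set feature statistics steer the calibration, which is what guards against distorting the distribution learned at training time.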
Track: Social Networks, Social Media, and Society
Submission Guidelines Scope: Yes
Submission Guidelines Blind: Yes
Submission Guidelines Format: Yes
Submission Guidelines Limit: Yes
Submission Guidelines Authorship: Yes
Student Author: No
Submission Number: 713