Zero-Shot Stance Detection in the Wild: Dynamic Target Generation and Multi-Target Adaptation

ACL ARR 2025 May Submission5036 Authors

20 May 2025 (modified: 03 Jul 2025) · ACL ARR 2025 May Submission · License: CC BY 4.0
Abstract: Current stance detection research typically predicts stance from a given target and text. In real-world social media, however, targets are neither predefined nor static; they are complex and dynamic. To address this challenge, we propose a novel task: zero-shot stance detection in the wild with dynamic target generation and multi-target adaptation, which aims to automatically identify multiple target-stance pairs from text without prior knowledge of the targets. We construct a Chinese social media stance detection dataset and design multi-dimensional evaluation metrics. We explore both integrated and two-stage fine-tuning strategies for large language models (LLMs) and evaluate a range of baseline models. Experimental results show that fine-tuned LLMs achieve superior performance on this task: the integrated fine-tuned Qwen2.5-7B attains the highest comprehensive target recognition score of 66.99%, while the two-stage fine-tuned DeepSeek-R1-Distill-Qwen-7B achieves a stance detection F1 score of 79.26%. The dataset and models are publicly available at: https://anonymous.4open.science/r/DGTA-stance-detection-7299.
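The sketch below is only an illustration of the task format described in the abstract (text in, multiple target-stance pairs out), not the authors' implementation; the prompt wording, the tab-separated output format, and the helper names (`build_prompt`, `parse_pairs`) are assumptions, and the label set is a common three-way scheme assumed for illustration.

```python
# Minimal sketch of the proposed task: given raw text with no predefined
# target, produce a list of (target, stance) pairs. The prompt and parsing
# format below are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import List

STANCE_LABELS = ("favor", "against", "neutral")  # assumed label set


@dataclass
class TargetStance:
    target: str  # target generated dynamically from the text itself
    stance: str  # one of STANCE_LABELS


def build_prompt(text: str) -> str:
    """Hypothetical 'integrated' prompt: one model generates targets and
    labels their stances in a single pass."""
    return (
        "Identify every stance target mentioned in the text below and, for "
        "each target, label the author's stance as favor, against, or neutral.\n"
        f"Text: {text}\n"
        "Answer as lines of 'target<TAB>stance'."
    )


def parse_pairs(model_output: str) -> List[TargetStance]:
    """Parse 'target<TAB>stance' lines into structured target-stance pairs."""
    pairs = []
    for line in model_output.strip().splitlines():
        if "\t" not in line:
            continue
        target, stance = line.split("\t", 1)
        if stance.strip().lower() in STANCE_LABELS:
            pairs.append(TargetStance(target.strip(), stance.strip().lower()))
    return pairs


if __name__ == "__main__":
    # Mocked model output, since no model is bundled with this sketch.
    mock_output = "new energy vehicle subsidies\tfavor\nfuel tax increase\tagainst"
    print(parse_pairs(mock_output))
```

The two-stage strategy mentioned in the abstract would split this into separate target-generation and stance-classification calls rather than a single integrated prompt.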
Paper Type: Long
Research Area: Sentiment Analysis, Stylistic Analysis, and Argument Mining
Research Area Keywords: stance detection
Contribution Types: Reproduction study, Publicly available software and/or pre-trained models, Data resources
Languages Studied: Chinese
Submission Number: 5036