RoBERTa Can Do More: Incorporating Syntax Into RoBERTa-based Sentiment Analysis Models Without Additional Computational Costs

Anonymous

03 Sept 2022 (modified: 05 May 2023) · ACL ARR 2022 September Blind Submission
Abstract: We present a simple but effective method for incorporating syntactic information obtained from dependency trees directly into transformer-based language models (e.g., RoBERTa) for tasks such as Aspect-Based Sentiment Classification (ABSC), where the desired output depends on specific input tokens. In contrast to prior approaches to ABSC that capture syntax by combining language models with graph neural networks over dependency trees, our model, Graph-integrated RoBERTa (GoBERTa), requires only a minimal increase in memory cost and in training and inference time over the underlying language model. Yet, GoBERTa outperforms these more complex models, yielding new state-of-the-art results on ABSC.
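
The abstract does not spell out how the dependency-tree information enters the model, so the following is only a minimal sketch of one plausible mechanism consistent with the stated constraints (no graph neural network, negligible added cost): adding a learned embedding for each token's dependency-relation label to RoBERTa's input embeddings before the encoder runs. The class name, `dep_label_ids`, and `num_dep_labels` are illustrative assumptions, not the paper's published design.

```python
# Sketch (assumed, not the paper's method): fold dependency syntax into
# RoBERTa by summing a per-token dependency-relation embedding with the
# model's own word embeddings, leaving the encoder untouched.
import torch.nn as nn
from transformers import RobertaModel

class SyntaxAugmentedRoberta(nn.Module):
    def __init__(self, num_dep_labels: int, model_name: str = "roberta-base"):
        super().__init__()
        self.roberta = RobertaModel.from_pretrained(model_name)
        hidden = self.roberta.config.hidden_size
        # One extra embedding table is the only added parameter cost.
        self.dep_embeddings = nn.Embedding(num_dep_labels, hidden)

    def forward(self, input_ids, attention_mask, dep_label_ids):
        # Look up RoBERTa's own word embeddings for the input tokens...
        word_embeds = self.roberta.embeddings.word_embeddings(input_ids)
        # ...and add the dependency-relation embedding for each token.
        inputs_embeds = word_embeds + self.dep_embeddings(dep_label_ids)
        # The transformer encoder itself is unchanged, so training and
        # inference cost stay essentially at the base model's level.
        return self.roberta(inputs_embeds=inputs_embeds,
                            attention_mask=attention_mask)
```

Because the only new parameters are a single embedding table and no extra layers are stacked on the encoder, a design along these lines would match the abstract's claim of minimal memory and runtime overhead relative to the underlying language model.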
Paper Type: long