A Comprehensive Analysis of the Quantum-like Approach for Integrating Syntactic and Semantic Information
Abstract: Transformers have proven effective at understanding and deciphering the intricate context of language. This success is achieved by models that lack explicit modeling of syntactic structures, which decades of computational linguistic research hypothesised to be necessary for logical text understanding.
In this work, we present a comprehensive analysis of syntactic and semantic context integration for text classification. We first introduce Compressed Phrase Embedding (ComPhE), which integrates syntactic parsing with semantic contextual information. We then evaluate ComPhE with two types of quantum-like approaches, 1) quantum-like input processing (DisCoWord) and 2) quantum-like attention (QSA), and discuss the contribution of compressed-phrase syntactic and semantic integration to model performance on different text classification benchmarks.
Paper Type: short
Research Area: Resources and Evaluation
Languages Studied: English
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.