Quantum-Inspired Sentence Representation: Rethinking Word-Based Density Matrices

ACL ARR 2024 June Submission 2406 Authors

15 Jun 2024 (modified: 19 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: This paper proposes a novel approach to enhancing traditional quantum-inspired models. We introduce the Quantum-Inspired Sentence Representation (QISR) model, which transforms word-level density matrices into representations of entire sentences, improving computational resource efficiency. Unlike traditional quantum-inspired models, QISR operates at the density matrix layer, and its benefit to the overall model grows as the embedding dimension increases: even with 768-dimensional word embeddings, the QPDN model requires only 1736 MB. This optimization benefits the overall model architecture, particularly when dealing with large word embedding dimensions. Furthermore, the approach reduces computing resource consumption while maintaining high computational accuracy, highlighting its potential for processing complex language tasks. This research offers a novel approach to sentence representation in quantum-inspired language models and demonstrates the value of improved computational methods in a quantum-inspired context. We expect these results to provide modeling support and practical guidance for future text-processing work.
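The abstract does not spell out how a sentence-level density matrix is built from word representations, and the exact QISR construction may differ. A common quantum-inspired formulation represents a sentence as a weighted mixture of word projectors |w_i⟩⟨w_i|; the sketch below illustrates that generic construction (function name, weighting scheme, and dimensions are illustrative assumptions, not the paper's method):

```python
import numpy as np

def sentence_density_matrix(word_vecs, weights=None):
    """Build a sentence density matrix as a weighted mixture of word
    projectors |w_i><w_i| (a common quantum-inspired construction;
    the paper's QISR method may differ in its details)."""
    vecs = np.asarray(word_vecs, dtype=float)
    # Normalize each word vector to a unit state |w_i>.
    vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    n = len(vecs)
    if weights is None:
        weights = np.full(n, 1.0 / n)  # uniform mixture over words
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # probabilities sum to 1
    # rho = sum_i p_i |w_i><w_i| : real-symmetric, PSD, trace one
    rho = np.einsum('i,ij,ik->jk', weights, vecs, vecs)
    return rho

# Toy sentence of 5 words with 8-dimensional embeddings.
rng = np.random.default_rng(0)
rho = sentence_density_matrix(rng.normal(size=(5, 8)))
print(np.trace(rho))  # a valid density matrix has unit trace
```

Because the sentence is summarized by one d×d matrix rather than one matrix per word, the memory cost of the density-matrix layer stays fixed in sentence length, which is consistent with the efficiency motivation stated above.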
Paper Type: Long
Research Area: Efficient/Low-Resource Methods for NLP
Research Area Keywords: parameter-efficient-training
Contribution Types: Approaches low compute settings-efficiency, Theory
Languages Studied: English
Submission Number: 2406