Keywords: Text Classification, Semantic Information Clustering, Recurrent Neural Network
Abstract: Models based on Recurrent Neural Networks (RNNs) have been widely employed for text classification tasks. Traditional RNNs primarily emphasize long-term memory capabilities. However, this emphasis does not fully align with human cognitive processes, particularly in classification tasks: the human brain typically extracts the information essential to the target categories, disregards irrelevant details, and compresses the input to accelerate decision-making. Inspired by this, we propose a novel architecture, the Fast Salient Factor Concentration (FSFC) RNN, designed specifically for classification tasks. FSFC dynamically clusters and compresses semantic information by leveraging the short-term memory capabilities of recurrent neural networks. Experimental results demonstrate that FSFC achieves accuracy comparable to existing RNNs while significantly improving training efficiency in classification tasks. On the YelpReviewFull dataset, FSFC improves accuracy by 1.37% over Long Short-Term Memory (LSTM) while reducing training time by 86%. Additionally, we propose a new evaluation metric, E-score, which integrates accuracy and time efficiency to comprehensively assess the overall performance of each network.
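(The abstract does not state the E-score definition. As a minimal illustrative sketch only, assuming a metric that rewards accuracy and penalizes training time, one could imagine a form such as

\[ \text{E-score} = \frac{\text{Acc}}{\log\!\left(1 + T_{\text{train}}\right)}, \]

where \(\text{Acc}\) is classification accuracy and \(T_{\text{train}}\) is wall-clock training time. This form is our assumption for illustration, not the paper's actual formula; any monotone combination of the two quantities would capture the same accuracy-versus-time trade-off the abstract describes.)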
Primary Area: learning on time series and dynamical systems
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11017