Dynamic Bi-Elman Attention Networks: A Dual-Directional Context-Aware Test-Time Learning for Text Classification

Published: 01 Jan 2025 · Last Modified: 17 Sept 2025 · CoRR 2025 · CC BY-SA 4.0
Abstract: Text classification, a fundamental task in natural language processing, aims to categorize textual data into predefined labels. Traditional methods have struggled with complex linguistic structures and semantic dependencies. The advent of deep learning, particularly recurrent neural networks and Transformer-based models, has significantly advanced the field by enabling nuanced feature extraction and context-aware predictions. Despite these improvements, existing models still exhibit limitations in balancing interpretability, computational efficiency, and long-range contextual understanding. To address these challenges, this paper proposes the Dynamic Bidirectional Elman with Attention Network (DBEAN). DBEAN integrates bidirectional temporal modeling with self-attention mechanisms, dynamically assigning weights to critical segments of the input to improve contextual representation while maintaining computational efficiency.
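The sketch below illustrates the kind of architecture the abstract describes: a bidirectional Elman RNN whose hidden states are pooled with an attention layer that weights salient time steps before classification. It is a minimal, hedged illustration assuming standard PyTorch components (`nn.RNN` with a tanh Elman cell plus additive attention pooling); the class name, layer sizes, and pooling form are illustrative assumptions, not the authors' exact DBEAN implementation.

```python
import torch
import torch.nn as nn


class BiElmanAttentionClassifier(nn.Module):
    """Illustrative bidirectional Elman RNN with attention pooling (assumed sketch)."""

    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 hidden_dim: int = 128, num_classes: int = 2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # nn.RNN with tanh nonlinearity is the classic Elman recurrent cell;
        # bidirectional=True provides the dual-directional context.
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True, nonlinearity="tanh")
        # Additive attention: score each time step, softmax-normalize,
        # so the model dynamically weights critical input segments.
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer token indices
        embedded = self.embedding(token_ids)       # (B, T, E)
        states, _ = self.rnn(embedded)             # (B, T, 2H)
        scores = self.attn_score(states)           # (B, T, 1)
        weights = torch.softmax(scores, dim=1)     # attention over time steps
        context = (weights * states).sum(dim=1)    # (B, 2H) weighted summary
        return self.classifier(context)            # (B, num_classes) logits


if __name__ == "__main__":
    model = BiElmanAttentionClassifier(vocab_size=10_000, num_classes=4)
    dummy = torch.randint(1, 10_000, (8, 32))      # batch of 8 sequences, length 32
    print(model(dummy).shape)                      # torch.Size([8, 4])
```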