A cross-temporal contrastive disentangled model for ancient Chinese understanding

Published: 01 Jan 2024 · Last Modified: 20 May 2025 · Neural Networks 2024 · CC BY-SA 4.0
Highlights:
- A novel ancient Chinese pre-training framework based on the cross-temporal property.
- A cross-temporal decoupling strategy for semantics and syntax is proposed.
- Experiments on 6 ancient Chinese understanding tasks demonstrate effectiveness.
- Provides new insights for other languages that have undergone evolutionary change.