A cross-temporal contrastive disentangled model for ancient Chinese understanding

Published: 2024, Last Modified: 06 Jan 2026 · Neural Networks 2024 · CC BY-SA 4.0
Abstract Highlights:
• A novel ancient Chinese pre-training framework based on the cross-temporal property.
• A cross-temporal decoupling strategy for semantics and syntax is proposed.
• Experiments on 6 ancient Chinese understanding tasks demonstrate effectiveness.
• Provides new insights for other languages that have undergone evolutionary change.