Rethinking Chinese Word Segmentation: Tokenization, Character Classification, or Wordbreak Identification

Chu-Ren Huang, Petr Šimon, Shu-Kai Hsieh, Laurent Prévot. 2007. In Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics, Companion Volume: Proceedings of the Demo and Poster Sessions.