Compressing Encoding of Words for use in Character-level Convolutional Networks for Text Classification

Anonymous

Nov 03, 2017 (modified: Nov 03, 2017) · ICLR 2018 Conference Blind Submission · Readers: everyone
  • Abstract: This paper reports an empirical exploration of a novel approach to encoding words with compression-inspired techniques for use in character-level convolutional neural networks. The approach drastically reduces the number of parameters to be optimized, yielding accuracies competitive with character-level convolutional neural networks in only a fraction of the training time, and enabling training on simpler hardware.
  • TL;DR: Compression-inspired word encodings enable faster training of character-level CNNs and reduce the dimensionality of the representation.
  • Keywords: Character-Level Convolutional Networks, Text Classification, Word Compression
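
The abstract does not specify which compression technique is used, so the following is only a minimal, hypothetical sketch of how a compression-inspired word encoding could feed a character-level CNN: words are Huffman-coded from their corpus frequencies, and the resulting bit strings are concatenated into a short fixed-length input vector instead of a raw character one-hot matrix. The function names (`build_huffman_codes`, `encode_document`) and the choice of Huffman coding are assumptions, not the authors' method.

```python
# Hypothetical sketch (not the paper's code): Huffman-code words by frequency,
# then turn a document into a short fixed-length 0/1 vector usable as 1-D CNN input.
import heapq
import itertools
from collections import Counter

def build_huffman_codes(word_counts):
    """Return {word: bitstring}, giving shorter codes to more frequent words."""
    tiebreak = itertools.count()  # unique tie-breaker so heapq never compares dicts
    heap = [(freq, next(tiebreak), {word: ""}) for word, freq in word_counts.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Each merge adds one bit closer to the root, so prepend it to existing codes.
        merged = {w: "0" + c for w, c in left.items()}
        merged.update({w: "1" + c for w, c in right.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

def encode_document(words, codes, max_bits=256):
    """Concatenate per-word codes into a fixed-length 0/1 vector for a 1-D CNN."""
    bits = "".join(codes[w] for w in words if w in codes)[:max_bits]
    vec = [int(b) for b in bits]
    return vec + [0] * (max_bits - len(vec))  # zero-pad to a fixed input length

if __name__ == "__main__":
    corpus = "the cat sat on the mat the cat ran".split()
    codes = build_huffman_codes(Counter(corpus))
    print(codes)  # frequent words receive shorter bit strings
    print(encode_document("the cat sat".split(), codes)[:16])
```

Under this assumed scheme, the compressed input is far shorter than a per-character one-hot matrix over a full alphabet, which is one plausible way the parameter count and training time reductions claimed in the abstract could arise.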
