Compressing Encoding of Words for use in Character-level Convolutional Networks for Text Classification
Nov 03, 2017 (modified: Nov 03, 2017) — ICLR 2018 Conference Blind Submission
Abstract: This paper reports an empirical exploration of a novel approach that encodes words using compression-inspired techniques for use in character-level convolutional neural networks. This approach drastically reduces the number of parameters to be optimized during deep learning, achieving competitive accuracies in a fraction of the time spent by character-level convolutional neural networks and enabling training on simpler hardware.
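The submission does not spell out its encoding scheme here, but a minimal sketch of one compression-inspired word encoding, assuming a Huffman-style code over character frequencies (the functions `huffman_codes`, `encode_word`, and the 16-bit vector length are hypothetical illustrations, not the paper's method), might look like:

```python
import heapq
from collections import Counter

def huffman_codes(corpus):
    """Build Huffman codes from character frequencies in `corpus`.

    Frequent characters receive shorter bit strings, so common words
    compress into fewer bits than a fixed one-hot character encoding.
    """
    freq = Counter(corpus)
    # Heap entries: (frequency, unique tiebreaker, {char: code-so-far}).
    heap = [(f, i, {c: ""}) for i, (c, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Merge two subtrees: prefix left codes with 0, right codes with 1.
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def encode_word(word, codes, length=16):
    """Encode a word as a fixed-length binary vector.

    Concatenates per-character Huffman codes, then truncates or
    zero-pads to `length` bits, yielding a compact input row for a CNN.
    """
    bits = "".join(codes.get(c, "") for c in word)
    bits = bits[:length].ljust(length, "0")
    return [int(b) for b in bits]
```

Under such a scheme each word maps to a short binary vector instead of a full character-by-alphabet one-hot matrix, which is one plausible route to the parameter reduction the abstract claims.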
TL;DR: Using compression techniques to encode words enables faster CNN training and dimensionality reduction of the representation.
Keywords:Character Level Convolutional Networks, Text Classification, Word Compressing