Keywords: MDL, neural networks, compression, prequential
TL;DR: We develop an alternative coding scheme for neural network description lengths that uses noisy datasets as intermediate learning steps, and we show its potential to challenge the dominant prequential approach.
Abstract: The minimum description length (MDL) principle has a rich history of informing neural network research, and numerous algorithms exist for computing efficient description lengths of neural networks.
Of these methods, prequential coding, based on the prequential approach to statistics, has proven to be highly successful.
Despite its success, standard prequential coding restricts learning at each increment to a prefix of the given dataset, a constraint that is potentially misaligned with an effective learning process.
In this paper we introduce prechastic coding, an alternative to the prequential approach that is based on a guided, noisy sequence of intermediate learning steps.
In our experiments we find that prechastic coding can challenge prequential coding in certain scenarios, while also leaving significant room for further improvement.
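For context, prequential coding computes a description length by scoring each example under a model trained only on the preceding prefix; a standard formulation (the notation $\theta_{i-1}$ for a model fit to the first $i-1$ examples is our assumption, not taken from this paper) is

$$L_{\mathrm{preq}}(y_{1:n} \mid x_{1:n}) = -\sum_{i=1}^{n} \log p_{\theta_{i-1}}(y_i \mid x_i).$$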
Submission Number: 14