Keywords: Time-cells, Memory, Cognitive Science, Architecture, Time-series, Recurrent Neural-Network
Abstract: Extracting temporal relationships over a range of scales is a hallmark of
human perception and cognition---and thus it is a critical feature of machine
learning applied to real-world problems. Neural networks either suffer from the
exploding/vanishing gradient problem, as in recurrent neural networks (RNNs), or
must adjust their parameters to learn the relevant time scales, as in LSTMs. This
paper introduces DeepSITH, a deep network comprising
biologically-inspired Scale-Invariant Temporal History (SITH) modules in
series with dense connections between layers. Each SITH module is simply a
set of time cells coding *what* happened *when* with a geometrically-spaced set of
time lags. The dense connections between layers change the definition of *what*
from one layer to the next. The geometric series of time lags implies that
the network codes time on a logarithmic scale, enabling a DeepSITH network to
learn problems requiring memory over a wide range of time scales. We compare
DeepSITH to LSTMs and other recent RNNs on several time series prediction and
decoding tasks. DeepSITH achieves performance comparable to the state of the art
on these problems and continues to perform well even as the delays
are increased.
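The architecture described in the abstract (each layer taps its input at a geometrically spaced set of time lags, and a dense projection between layers redefines *what* is remembered at the next layer) can be sketched roughly as follows. This is only an illustrative sketch, not the authors' implementation (see the Code link below for that): `LaggedMemoryLayer` and `geometric_lags` are hypothetical names, and a plain delay-line buffer stands in for the SITH temporal-history module.

```python
import math

import torch
import torch.nn as nn


def geometric_lags(tau_min=1.0, tau_max=100.0, n_taus=8):
    """Integer time lags spaced geometrically between tau_min and tau_max,
    so that memory is sampled on a roughly logarithmic time axis."""
    lags = torch.logspace(math.log10(tau_min), math.log10(tau_max), n_taus)
    return lags.round().long()


class LaggedMemoryLayer(nn.Module):
    """One layer of the sketch: the input is read out at geometrically spaced
    lags (a crude stand-in for a SITH module), then a dense projection mixes
    the lagged features before they are passed to the next layer."""

    def __init__(self, in_features, out_features, tau_min=1.0, tau_max=100.0, n_taus=8):
        super().__init__()
        self.lags = geometric_lags(tau_min, tau_max, n_taus)
        self.pad = int(self.lags.max().item())
        self.dense = nn.Linear(in_features * n_taus, out_features)

    def forward(self, x):
        # x: (batch, time, in_features)
        batch, T, F = x.shape
        # Zero-pad the past so every lag is defined at every time step.
        padded = torch.cat([x.new_zeros(batch, self.pad, F), x], dim=1)
        # For each lag tau, take the input as it looked tau steps earlier.
        lagged = [padded[:, self.pad - tau: self.pad - tau + T, :]
                  for tau in self.lags.tolist()]
        features = torch.cat(lagged, dim=-1)      # (batch, time, F * n_taus)
        return torch.relu(self.dense(features))   # (batch, time, out_features)


# Layers in series with a linear decoding head, per the abstract's description.
model = nn.Sequential(
    LaggedMemoryLayer(1, 32, tau_min=1.0, tau_max=50.0),
    LaggedMemoryLayer(32, 32, tau_min=1.0, tau_max=500.0),
    nn.Linear(32, 10),
)
y = model(torch.randn(4, 600, 1))   # (batch=4, time=600, classes=10)
```

The point of the sketch is the geometric spacing: each layer covers a wide range of delays with only a handful of lags, so a short stack of such layers can span memory over many time scales without the per-step recurrence that makes gradients explode or vanish.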
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Supplementary Material: pdf
Code: https://github.com/compmem/DeepSITH
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/deepsith-efficient-learning-via-decomposition/code)