Highlights

- "Sensitivity" is a local version of the maximum Lyapunov exponent for each neuron.
- Log sensitivity is equivalent to the maximum Lyapunov exponent until the dynamics reach the "edge of chaos".
- Sensitivity Adjustment Learning (SAL) adjusts the sensitivity in each neuron and, as a result, realizes the "edge of chaos" in the network.
- SAL also prevents the "vanishing gradient" problem in gradient-based learning such as BP or BPTT, which greatly improves learning performance.
- Compared with adjusting the weight matrix directly, SAL can account for non-linearity and prevent the loss of sensitivity caused by other learning.
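To make the idea concrete, the sketch below shows one plausible formalization of per-neuron sensitivity (the slope of the activation at the operating point times the input-weight norm) and a hypothetical SAL-style update that nudges a neuron's weights until its sensitivity approaches 1, i.e., the "edge of chaos". The function names, the finite-difference gradient, and the target value are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def neuron_sensitivity(w, x, f_prime):
    """One plausible per-neuron sensitivity: activation slope at the
    operating point times the input-weight norm (assumed definition)."""
    u = w @ x
    return f_prime(u) * np.linalg.norm(w)

def sal_step(w, x, eta=0.05, target=1.0):
    """Hypothetical SAL-style update: move the weights so that the
    neuron's sensitivity drifts toward `target` (~1, the edge of chaos)."""
    tanh_prime = lambda u: 1.0 - np.tanh(u) ** 2
    s = neuron_sensitivity(w, x, tanh_prime)
    # Illustrative finite-difference gradient of s with respect to w.
    eps = 1e-6
    grad = np.zeros_like(w)
    for i in range(len(w)):
        w2 = w.copy()
        w2[i] += eps
        grad[i] = (neuron_sensitivity(w2, x, tanh_prime) - s) / eps
    # Push sensitivity toward the target; the step sign flips as needed.
    return w + eta * (target - s) * grad

# Usage: starting from a low-sensitivity neuron, repeated updates raise
# its sensitivity toward 1 instead of letting it vanish.
w = np.array([0.1, 0.2])
x = np.array([0.5, -0.3])
for _ in range(300):
    w = sal_step(w, x)
```

Because the update scales with `(target - s)`, sensitivities below the target are amplified and sensitivities above it are damped, which is the intuition behind SAL counteracting the vanishing gradient in BP/BPTT.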