Abstract: Neural networks often excel at short-horizon tasks, but their long-term reliability is less assured. We demonstrate that even a minimal architecture, trained on near-periodic data, can exhibit hyperbolic chaotic behavior after a small parameter perturbation. Drawing on classical dynamical systems theory (especially Lyapunov exponents and structural stability), we show that borderline-zero exponents do not shield multi-step forecasts from instability when genuine structural stability is absent. A weight change on the order of $10^{-3}$ can radically alter long-horizon forecasting, contradicting the notion that strong local metrics ensure global robustness. We propose a simple “pinning” strategy to curb runaway expansions by constraining certain outputs, yet borderline orbits remain a common pitfall in larger networks. Our findings underscore that short-horizon validation may fail to detect critical multi-step vulnerabilities, and that global diagnostics alongside structural stability are essential for reliable long-term forecasting.
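To make the borderline-exponent claim concrete, here is a minimal numerical sketch (not the paper's model or code): the logistic map stands in for an iterated one-step forecaster, and we estimate its largest Lyapunov exponent just below, at, and just above the period-doubling accumulation point r* ≈ 3.569946, where the exponent is borderline zero. In this toy setting a parameter shift of order $10^{-3}$ typically flips the exponent's sign, separating stable from chaotic long-horizon behavior; all parameter values below are illustrative assumptions.

```python
# Minimal sketch, not the paper's architecture: the logistic map
# x_{t+1} = r * x_t * (1 - x_t) stands in for an iterated forecaster.
# Near the period-doubling accumulation point r* ~ 3.569946 the largest
# Lyapunov exponent sits near zero, so a parameter change of order 1e-3
# can flip its sign. All constants below are illustrative.
import math

def lyapunov_exponent(r, x0=0.2, n_steps=200_000, burn_in=10_000):
    """Largest Lyapunov exponent of x -> r*x*(1-x): the time average of
    log|f'(x_t)| along the orbit, with f'(x) = r*(1 - 2*x)."""
    x, acc = x0, 0.0
    for t in range(n_steps):
        if t >= burn_in:
            # log-derivative at the CURRENT state, before stepping forward
            acc += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
        x = r * x * (1.0 - x)  # advance the orbit one step
    return acc / (n_steps - burn_in)

R_STAR = 3.569946  # period-doubling accumulation point (borderline exponent)
EPS = 1e-3         # perturbation of the order discussed in the abstract

for r in (R_STAR - EPS, R_STAR, R_STAR + EPS):
    print(f"r = {r:.6f}  ->  lambda ~ {lyapunov_exponent(r):+.5f}")
# Typically: lambda < 0 below r* (periodic, stable forecasts), ~ 0 at r*,
# and > 0 above r* (chaotic), even though the one-step map barely changed.
```

The same orbit-averaged log-derivative diagnostic extends to a trained network by differentiating its learned one-step map along a rollout, which is one way to surface the multi-step vulnerabilities that short-horizon validation can miss.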