Abstract: Continuous-time liquid neural networks constitute a novel class of machine learning models that emulate the dynamics of biological neurons and synapses using ordinary differential equations. Despite their promising applications in predicting spatiotemporal dynamics, the adoption of these models has been constrained by their reliance on computationally expensive numerical differential equation solvers or approximate solutions. In this work, we propose a redefinition of the network's core neuron that accommodates multiple presynaptic connections. We then derive a closed-form solution and present an implementation through a computationally efficient recursive algorithm. Our solution is validated both at the level of individual neurons and within a deep neural network architecture. Experimental results on sequential modeling tasks with image, sensor, or medical data demonstrate improved performance compared to state-of-the-art numerical and approximate methods.