Synthesis of Differentiable Functional Programs for Lifelong Learning

Anonymous

02 Jun 2018 (modified: 05 May 2023) · Submitted to NAMPI 2018
Abstract: We present a neurosymbolic approach to the lifelong learning of algorithmic tasks that mix perception and procedural reasoning. Reusing high-level concepts across domains and learning complex procedures are key challenges in lifelong learning. We show that a combination of gradient-based learning and symbolic program synthesis can be a more effective response to these challenges than purely neural methods. Our approach, called \system, represents neural networks as strongly typed, end-to-end differentiable functional programs that use symbolic higher-order combinators to compose a library of neural functions. Our learning algorithm consists of: (1) a symbolic program synthesizer that performs a type-directed search over parameterized programs, deciding which library functions to reuse and which architectures to combine them in while learning a sequence of tasks; and (2) a neural module that trains these programs using stochastic gradient descent. Our experiments show that \system transfers high-level concepts more effectively than traditional transfer learning and progressive neural networks.
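
To make the core idea concrete, the sketch below illustrates (in PyTorch, under our own assumptions rather than the paper's actual library or API) how a differentiable functional program can compose neural library functions with symbolic higher-order combinators and still be trained end-to-end with SGD. The module names (Perceive, Classify), the map_combinator and compose helpers, and the toy data are hypothetical, chosen only to mirror the abstract's description.

```python
# Hypothetical sketch: neural library functions composed by symbolic combinators,
# trained end-to-end with stochastic gradient descent. Not the paper's actual API.
import torch
import torch.nn as nn


class Perceive(nn.Module):
    """Library function: maps a 28x28 image to a feature vector (type: Image -> R^d)."""
    def __init__(self, d=16):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, d), nn.ReLU())

    def forward(self, x):
        return self.net(x)


class Classify(nn.Module):
    """Library function: maps a feature vector to class logits (type: R^d -> R^k)."""
    def __init__(self, d=16, k=10):
        super().__init__()
        self.net = nn.Linear(d, k)

    def forward(self, z):
        return self.net(z)


def map_combinator(f):
    """Symbolic higher-order combinator: lift f over a sequence ((a -> b) -> [a] -> [b])."""
    def mapped(xs):  # xs: (batch, seq_len, ...) tensor
        return torch.stack([f(xs[:, i]) for i in range(xs.shape[1])], dim=1)
    return mapped


def compose(g, f):
    """Symbolic combinator: function composition g . f."""
    return lambda x: g(f(x))


# One candidate program: map (classify . perceive) over a sequence of images.
perceive, classify = Perceive(), Classify()
program = map_combinator(compose(classify, perceive))

# The composed program is end-to-end differentiable, so its neural parameters
# can be trained with SGD while the combinator structure stays symbolic.
params = list(perceive.parameters()) + list(classify.parameters())
opt = torch.optim.SGD(params, lr=0.1)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 5, 28, 28)      # batch of 8 sequences of 5 toy images
labels = torch.randint(0, 10, (8, 5))   # per-image class labels
logits = program(images)                # shape (8, 5, 10)
loss = loss_fn(logits.reshape(-1, 10), labels.reshape(-1))
opt.zero_grad()
loss.backward()
opt.step()
```

In a lifelong-learning setting, the symbolic synthesizer would search over such combinator structures (deciding, for example, whether to reuse an already-trained Perceive module for a new task), while only the parameters of the chosen neural functions are updated by gradient descent.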
Keywords: program synthesis, lifelong learning