Neural Relational Inference with Fast Modular Meta-learning

Ferran Alet, Erica Weng, Tomas Lozano-Perez, Leslie Kaelbling

NeurIPS 2019 · 06 Sept 2019 (modified: 05 May 2023)
Abstract: Graph neural networks (GNNs) are effective models for many dynamical systems consisting of entities and relations. Although most GNN applications assume a single type of entity and relation, many situations involve multiple types of interactions. Relational inference is the problem of inferring these interactions and learning the dynamics from observational data. We frame relational inference as a modular meta-learning problem, where neural modules are trained to be composed in different ways to solve many tasks. This framework allows us to implicitly encode time invariance, leading to greater data efficiency, and to infer relations in the context of one another rather than independently, increasing inference capacity. Moreover, framing inference as the inner-loop optimization in a meta-learning setting allows us to estimate the state of entities that we do not observe directly, but whose presence we can infer through their effects on observed entities. To address the large search space of graph neural network compositions, we meta-learn a proposal function that speeds up the inner-loop simulated annealing search within the modular meta-learning algorithm, yielding a one-to-two order-of-magnitude increase in the size of problems that can be addressed.
Code Link: https://github.com/FerranAlet/modular-metalearning
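To make the inner loop described in the abstract concrete, here is a minimal sketch of a simulated-annealing search over discrete assignments of neural modules to relations. This is an illustrative reconstruction, not the authors' implementation (see the linked repository for that); the names `score_structure`, `simulated_annealing_search`, and all parameters are hypothetical placeholders.

```python
import math
import random

def simulated_annealing_search(edges, num_module_types, score_structure,
                               num_steps=1000, t_init=1.0, t_final=0.01):
    """Search over discrete module-type assignments, one per edge.

    score_structure: hypothetical callable mapping {edge: module_type}
    to a loss; in the paper's setting this would run the composed GNN
    on observed trajectory data (lower loss is better).
    """
    # Start from a random assignment of module types to edges.
    structure = {e: random.randrange(num_module_types) for e in edges}
    loss = score_structure(structure)
    best_structure, best_loss = dict(structure), loss

    for step in range(num_steps):
        # Geometric annealing schedule from t_init down to t_final.
        t = t_init * (t_final / t_init) ** (step / max(1, num_steps - 1))

        # Propose changing one edge's module type. A meta-learned
        # proposal function, as in the paper, would bias this choice
        # toward promising changes instead of sampling uniformly.
        edge = random.choice(list(edges))
        old_type = structure[edge]
        structure[edge] = random.randrange(num_module_types)
        new_loss = score_structure(structure)

        # Metropolis acceptance: always accept improvements; accept
        # worse structures with probability exp(-delta / t).
        delta = new_loss - loss
        if delta <= 0 or random.random() < math.exp(-delta / t):
            loss = new_loss
            if loss < best_loss:
                best_structure, best_loss = dict(structure), loss
        else:
            structure[edge] = old_type  # revert rejected proposal

    return best_structure, best_loss
```

As a usage sketch, `edges` could be all ordered pairs of observed entities and `score_structure` could roll the composed GNN forward one step and return mean-squared prediction error; the meta-learned proposal function is what lets this search scale to the larger problems the abstract mentions.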