Learning Recurrent Representations for Hierarchical Behavior Modeling

Eyrun Eyjolfsdottir, Kristin Branson, Yisong Yue, Pietro Perona

Nov 03, 2016 (modified: Mar 05, 2017) ICLR 2017 conference submission
  • Abstract: We propose a framework for detecting action patterns from motion sequences and modeling the sensory-motor relationship of animals, using a generative recurrent neural network. The network has a discriminative part (classifying actions) and a generative part (predicting motion), whose recurrent cells are laterally connected, allowing higher levels of the network to represent high-level behavioral phenomena. We test our framework on two types of tracking data, fruit fly behavior and online handwriting. Our results show that 1) taking advantage of unlabeled sequences, by predicting future motion, significantly improves action detection performance when training labels are scarce, 2) the network learns to represent high-level phenomena such as writer identity and fly gender, without supervision, and 3) simulated motion trajectories, generated by feeding the network's motion predictions back in as input, look realistic and may be used to qualitatively evaluate whether the model has learned generative control rules.
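The architecture sketched in the abstract pairs a discriminative recurrent cell (action classification) with a generative recurrent cell (motion prediction), connected laterally so the generative cell can condition on the discriminative state; simulation then feeds the predicted motion back in as the next input. Below is a minimal NumPy sketch of that idea under stated assumptions: all parameter names (`Wx`, `Wh`, `Wg`, `Wl`, `Wa`, `Wm`), the dimensions, and the single-layer tanh cells are illustrative choices of ours, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, D_HID, N_ACT = 4, 8, 3  # motion dims, hidden size, action classes (illustrative)

# Hypothetical parameters for a minimal sketch, not the paper's exact model.
Wx = rng.standard_normal((D_HID, D_IN)) * 0.1   # input -> discriminative cell
Wh = rng.standard_normal((D_HID, D_HID)) * 0.1  # discriminative recurrence
Wg = rng.standard_normal((D_HID, D_HID)) * 0.1  # generative recurrence
Wl = rng.standard_normal((D_HID, D_HID)) * 0.1  # lateral: discriminative -> generative
Wa = rng.standard_normal((N_ACT, D_HID)) * 0.1  # action-classification head
Wm = rng.standard_normal((D_IN, D_HID)) * 0.1   # motion-prediction head

def step(x, h_d, h_g):
    """One time step: classify the current action and predict the next motion."""
    h_d = np.tanh(Wx @ x + Wh @ h_d)   # discriminative cell sees the motion input
    h_g = np.tanh(Wg @ h_g + Wl @ h_d) # generative cell is laterally fed by h_d
    logits = Wa @ h_d                  # action scores (discriminative output)
    x_next = Wm @ h_g                  # predicted next motion (generative output)
    return logits, x_next, h_d, h_g

# Simulation: feed the model's own motion prediction back as the next input.
h_d, h_g = np.zeros(D_HID), np.zeros(D_HID)
x = rng.standard_normal(D_IN)          # arbitrary initial motion frame
trajectory = []
for t in range(5):
    logits, x, h_d, h_g = step(x, h_d, h_g)
    trajectory.append(x)
print(len(trajectory), trajectory[0].shape)
```

In a trained model, the same closed loop would generate the simulated trajectories the abstract describes; here the random weights only illustrate the data flow between the two cells.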
  • Keywords: Unsupervised Learning, Semi-Supervised Learning, Reinforcement Learning, Applications
  • Conflicts: caltech.edu, janelia.hhmi.org, disneyresearch.com, cmu.edu, cornell.edu