FlyView: a bio-informed optical flow truth dataset for visual navigation using panoramic stereo vision

Published: 17 Sept 2022, Last Modified: 23 May 2023, NeurIPS 2022 Datasets and Benchmarks
Keywords: optical flow, motion flow, self-motion, ego-motion, fly, Drosophila, Calliphora
TL;DR: A dataset for motion flow and ego-motion estimation inspired by fly vision
Abstract: Flying at speed through complex environments is a task that insects have performed successfully since the Carboniferous, but which remains a challenge for robotic and autonomous systems. Insects navigate the world using optical flow sensed by their compound eyes, which they process using a deep neural network weighing just a few milligrams. Deploying an insect-inspired network architecture in computer vision could therefore enable more efficient and effective ways of estimating structure and self-motion using optical flow. Training a bio-informed deep network to implement these tasks requires biologically relevant training, test, and validation data. To this end, we introduce FlyView, a novel bio-informed truth dataset for visual navigation. This simulated dataset is rendered using open-source 3D scenes in which the observer's position is known at every frame, and is accompanied by truth data on depth, self-motion, and motion flow. The dataset, comprising 42,475 frames, has several key features that are missing from existing optical flow datasets, including: (i) panoramic cameras with monocular and binocular fields of view matched to those of a fly's compound eyes; (ii) dynamically meaningful self-motion modelled on motion primitives or on the 3D trajectories of drones and flies; and (iii) complex natural and indoor environments including reflective surfaces.
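The relationship between depth, self-motion, and motion flow that the truth data captures is governed by the classical motion-field equation for a spherical imager: for a unit viewing direction d at scene distance Z, with camera translation v and rotation ω (both in the camera frame), the angular flow is ḋ = -ω × d - (v - (v·d)d)/Z. The following NumPy sketch illustrates this relation; it is our own illustrative code with hypothetical names, not the authors' rendering pipeline.

```python
import numpy as np

def spherical_motion_field(dirs, depth, v, omega):
    """Angular motion field on a panoramic (spherical) imager.

    dirs:  (N, 3) unit viewing directions in the camera frame.
    depth: (N,)   distance to the scene point along each direction.
    v:     (3,)   translational velocity of the camera (camera frame).
    omega: (3,)   angular velocity of the camera (camera frame).

    Returns the (N, 3) tangential velocity of each viewing direction:
        d_dot = -omega x d - (v - (v . d) d) / Z
    """
    radial = (dirs @ v)[:, None] * dirs             # component of v along each ray
    translational = -(v - radial) / depth[:, None]  # nearer scene -> larger flow
    rotational = -np.cross(omega, dirs)             # depth-independent rotation term
    return rotational + translational
```

Note that the translational term scales with inverse depth while the rotational term does not, which is why panoramic flow fields of the kind FlyView provides allow ego-motion and scene structure to be disentangled.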
URL: https://github.com/Ahleroy/FlyView
Dataset URL: Data will be uploaded to Hugging Face, and download links will be shared on the GitHub repository (https://github.com/Ahleroy/FlyView). For now, only the sample scene is available on the GitHub repository.
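To give a concrete sense of how the per-frame truth data might be consumed once released, here is a minimal loading sketch. The directory layout, file names, and formats below are hypothetical placeholders; the authoritative structure is whatever the GitHub repository documents.

```python
from pathlib import Path
import numpy as np

def load_frame(scene_dir: str, idx: int):
    """Load one frame of a FlyView-style scene (illustrative only).

    All file names below are hypothetical, not the repository's
    actual layout; check the repository README before use.
    """
    scene = Path(scene_dir)
    depth = np.load(scene / f"frame_{idx:06d}_depth.npy")  # per-pixel distance
    flow = np.load(scene / f"frame_{idx:06d}_flow.npy")    # motion-flow truth
    pose = np.load(scene / f"frame_{idx:06d}_pose.npy")    # observer position/orientation
    return depth, flow, pose
```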
Dataset Embargo: The project is funded by DSTL, so the dataset must be approved before release. We will ensure that the dataset is fully released by the date of the conference. In addition, the dataset comprises roughly 4 TB of data, so it will be released in chunks across multiple hosting platforms. A sample is available to the reviewers.
License: 3D assets are the property of their authors. Please refer to the supplementary materials for the license of each individual asset. Trajectories and code are the property of the University of Oxford and are generously made available to the scientific community under the CC-BY-NC-SA 4.0 license. Source code is made available under the MIT license.
Author Statement: Yes
Supplementary Material: zip
Contribution Process Agreement: Yes
In Person Attendance: Yes