Learning to be Multimodal: Co-evolving Sensory Modalities and Sensor Properties

Published: 01 Oct 2021, Last Modified: 05 May 2023. CoRL 2021, Blue Sky.
Keywords: Multimodal Sensing, Co-evolving Design
TL;DR: Our blue-sky view is that we could co-evolve sensor hardware design and the choice of sensory modalities together with the development of the learning approaches that must interpret this sensory data.
Abstract: Making a single sensory modality precise and robust enough to reach human-level performance and autonomy could be very expensive or intractable. Fusing information from multiple sensory modalities is promising -- for example, recent work has shown benefits from combining vision with haptic sensors or with audio data. Learning-based methods facilitate faster progress in this field by removing the need for manual feature engineering. However, the sensor properties and the choice of sensory modalities are still usually decided manually. Our blue-sky view is that we could simulate/emulate sensors with various properties, then infer which properties and combinations of sensors yield the best learning outcomes. This view would incentivize the development of novel, affordable sensors that can make a noticeable impact on the performance, robustness, and ease of training classifiers, models, and policies for robotics. It would also motivate building hardware that provides signals complementary to the existing ones. As a result, we could significantly expand the realm of applicability of learning-based approaches.
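To make the proposed loop concrete, below is a minimal, hypothetical sketch of searching over emulated sensor suites and picking the configuration with the best learning outcome. The modality list, sensor properties, and the `evaluate_learning_outcome` scoring proxy are illustrative placeholders, not the authors' method; in practice the evaluation step would involve simulating each sensor suite and training a classifier or policy on the resulting data.

```python
# Hypothetical sketch: enumerate simulated sensor designs (modality subsets plus
# emulated sensor properties) and select the one with the best learning outcome.
import itertools
import random

MODALITIES = ["rgb", "depth", "haptic", "audio"]   # candidate sensory modalities
NOISE_LEVELS = [0.0, 0.05, 0.2]                    # emulated sensor noise
RESOLUTIONS = ["low", "high"]                      # emulated sensor fidelity


def evaluate_learning_outcome(modalities, noise, resolution, seed=0):
    """Placeholder for: simulate sensors with these properties, train a
    model/policy on the rendered data, and return validation performance."""
    rng = random.Random(hash((tuple(modalities), noise, resolution, seed)))
    # Toy proxy: richer, cleaner, higher-fidelity suites score higher on average.
    base = 0.5 + 0.1 * len(modalities) - 0.5 * noise
    base += 0.05 if resolution == "high" else 0.0
    return base + rng.gauss(0, 0.02)


def search_sensor_designs():
    best_score, best_design = float("-inf"), None
    # Non-empty modality subsets crossed with emulated sensor properties.
    for k in range(1, len(MODALITIES) + 1):
        for combo in itertools.combinations(MODALITIES, k):
            for noise, res in itertools.product(NOISE_LEVELS, RESOLUTIONS):
                score = evaluate_learning_outcome(combo, noise, res)
                if score > best_score:
                    best_score, best_design = score, (combo, noise, res)
    return best_design, best_score


if __name__ == "__main__":
    design, score = search_sensor_designs()
    print(f"best simulated sensor design: {design}, estimated outcome: {score:.3f}")
```

Exhaustive enumeration is used here only for clarity; a real co-evolution loop would likely rely on Bayesian optimization or evolutionary search, since each evaluation involves an expensive training run.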