Keywords: Robots, Acoustic Noise, Vision, Learning
TL;DR: We propose the Acoustic Noise Predictor (ANP), which learns how "loud" the robot's actions will be for a listener in a home or an office.
Abstract: We propose ANAVI (Audio Noise Awareness using Visuals of Indoors for NAVIgation) for quieter robot path planning. While humans are naturally aware of the noise they make and its impact on those around them, robots currently lack this awareness.
A key challenge in achieving audio awareness for robots is estimating how loud the robot's actions will be at a listener's location. Since sound depends on the geometry and material composition of rooms, we train the robot to passively perceive loudness using visual observations of indoor environments. To this end, we generate data on how loud an "impulse" sounds at different listener locations in simulated homes, and train our Acoustic Noise Predictor (ANP). Next, we collect acoustic profiles corresponding to different navigation actions. Unifying ANP with action acoustics, we demonstrate experiments with wheeled (Hello Robot Stretch) and legged (Unitree Go2) robots so that these robots adhere to the noise constraints of the environment. All simulated and real-world data, code, and model checkpoints are released at https://anavi-corl24.github.io/.
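To make the pipeline concrete, below is a minimal PyTorch sketch of an ANP-style regressor that maps a visual observation of the room plus the listener's position relative to the robot to a scalar loudness estimate. The ResNet-18 backbone, the (dx, dy) offset encoding, the dB-regression target, and all names are illustrative assumptions; the abstract does not specify the actual architecture or inputs.

```python
# Hypothetical sketch of an Acoustic Noise Predictor (ANP)-style model, based
# only on the abstract: visual observation + listener offset -> loudness.
# Architecture details (ResNet-18, 2-D offset, scalar dB output) are assumed.
import torch
import torch.nn as nn
import torchvision.models as models


class AcousticNoisePredictorSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Visual encoder: ResNet-18 with the classification head removed,
        # so it emits a 512-d feature describing room geometry/materials.
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()
        self.encoder = backbone
        # Fuse image features with the listener's (dx, dy) offset from the robot.
        self.head = nn.Sequential(
            nn.Linear(512 + 2, 128),
            nn.ReLU(),
            nn.Linear(128, 1),  # predicted loudness of an impulse at the listener
        )

    def forward(self, image, listener_offset):
        # image: (B, 3, H, W) RGB view of the indoor scene
        # listener_offset: (B, 2) listener position relative to the robot
        feats = self.encoder(image)
        return self.head(torch.cat([feats, listener_offset], dim=-1))


# Regression against simulated impulse-loudness labels (dummy data for shape).
model = AcousticNoisePredictorSketch()
image = torch.randn(4, 3, 224, 224)
offset = torch.randn(4, 2)
target_db = torch.randn(4, 1)
loss = nn.functional.mse_loss(model(image, offset), target_db)
loss.backward()
```

At planning time, such a predictor could be queried per candidate action by scaling its impulse-loudness estimate with the measured acoustic profile of that action, which is one plausible reading of "unifying ANP with action acoustics" above.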
Supplementary Material: zip
Video: https://youtu.be/Bq8UutU5JnA
Website: https://anavi-corl24.github.io/
Code: https://github.com/vidhiJain/anavi
Student Paper: yes
Spotlight Video: mp4
Submission Number: 401