A Data-Driven Approach for the Localization of Interacting Agents via a Multi-Modal Dynamic Bayesian Network Framework

Published: 01 Jan 2022, Last Modified: 05 Nov 2023, AVSS 2022
Abstract: This paper proposes a multi-modal situational interaction model for collaborative agents by fusing multi-sensorial information in a Multi-Agent Hierarchical Dynamic Bayesian Network (MAH-DBN) framework. The proposed model is learned in a data-driven manner to estimate the states of interacting agents from video sequences alone. This can be regarded as a two-fold methodology for improving vision-based localization and interaction between autonomous agents. In the learning stage, the odometry model drives the video learning model to achieve robust localization and interaction modeling. In the testing phase, the learned MAH-DBN model localizes the collaborative agents from video sequences alone via a proposed inference method, the Multi-Agent Coupled Markov Jump Particle Filter (MAC-MJPF).
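The MAC-MJPF inference builds on particle filtering, where a set of weighted samples approximates the posterior over each agent's state. As a generic illustration only (not the authors' MAC-MJPF, whose transition, coupling, and observation models are defined in the paper), a single predict-update-resample step of a bootstrap particle filter might be sketched as follows; the `transition` and `likelihood` functions are hypothetical placeholders for the learned models:

```python
import numpy as np

def particle_filter_step(particles, weights, transition, likelihood, obs, rng):
    """One predict-update-resample step of a bootstrap particle filter.

    particles : array of state samples
    weights   : normalized importance weights
    transition: stochastic motion model, particles -> propagated particles
    likelihood: observation model, p(obs | particle), evaluated per particle
    """
    # Predict: propagate each particle through the (stochastic) transition model.
    particles = transition(particles, rng)
    # Update: reweight particles by the observation likelihood and renormalize.
    weights = weights * likelihood(obs, particles)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses (weight degeneracy).
    ess = 1.0 / np.sum(weights ** 2)
    if ess < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```

In a multi-agent coupled variant, the transition model of one agent would additionally condition on the estimated states of the other agents, which is the kind of coupling the MAC-MJPF formalizes within the MAH-DBN.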