Deployment Characterization of Onboard Computing Platforms for Underwater Visual Inference

Published: 27 Apr 2026, Last Modified: 27 Apr 2026 · MaCVi Poster · CC BY 4.0
Keywords: Underwater Robotics, Underwater Visual Inference, Edge AI, Onboard Deployment, Mission Profiles, Energy Efficiency
TL;DR: We characterize how Tiny, Mobile, and Accelerated Edge platforms trade off throughput, accuracy, and energy for underwater visual inference, showing that deployment suitability depends on execution path and mission profile.
Abstract: Underwater visual inference is increasingly used for robotic inspection and monitoring, but onboard deployment is constrained by tight energy budgets, limited internal volume, and platform-dependent runtime support. This paper characterizes deployment behavior for underwater visual inference across three representative onboard computing classes: Tiny Edge, Mobile Edge, and Accelerated Edge. Using a real-world underwater dataset, we evaluate lightweight classification and detection models under a common resolution-constrained operating regime with platform-supported runtimes and precision modes, comparing accuracy, throughput, and energy consumption. The results show that deployment efficiency is not determined by hardware class alone. Under the evaluated lightweight workload, the Mobile Edge class achieved a more favorable throughput-energy operating point than the Accelerated Edge class, consistent with accelerator-related overheads reducing the benefit of hardware acceleration when workload size is small. The Tiny Edge class, despite its low instantaneous power, incurred the highest energy per frame when inference latency dominated, but became the most energy-efficient option in an intermittent monitoring scenario where idle energy governed total mission cost. These findings indicate that the onboard platform for underwater visual inference should be selected according to workload characteristics, execution path, and mission profile, rather than peak hardware capability alone.
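The abstract's central energy argument (high per-frame energy on slow, low-power hardware versus idle-dominated cost in intermittent missions) can be sketched with a simple accounting model. The following is a minimal illustration, not the paper's methodology; all power and latency numbers are hypothetical placeholders, not measured values from the study.

```python
# Hypothetical energy model illustrating the mission-profile argument:
# per-frame energy favors fast platforms, but duty-cycled missions are
# governed by idle draw. All numbers below are illustrative, not measured.

def energy_per_frame(active_power_w: float, latency_s: float) -> float:
    """Energy (J) to process one frame while the platform is active."""
    return active_power_w * latency_s

def mission_energy(active_power_w: float, idle_power_w: float,
                   latency_s: float, frames: int,
                   mission_s: float) -> float:
    """Total energy (J) for an intermittent-monitoring mission:
    active inference time plus idle draw for the remaining time."""
    active_s = frames * latency_s
    return active_power_w * active_s + idle_power_w * (mission_s - active_s)

# Two hypothetical platform classes (placeholder power/latency values):
tiny   = dict(active_power_w=0.5, idle_power_w=0.05, latency_s=2.0)
mobile = dict(active_power_w=5.0, idle_power_w=1.5,  latency_s=0.05)

# Per-frame view: the slow Tiny platform spends more joules per inference
# because latency dominates (0.5 W * 2.0 s > 5.0 W * 0.05 s).
assert energy_per_frame(tiny["active_power_w"], tiny["latency_s"]) > \
       energy_per_frame(mobile["active_power_w"], mobile["latency_s"])

# Mission view: one frame per minute for an hour. Idle energy now governs
# total cost, and the Tiny platform becomes the cheaper option overall.
frames, mission_s = 60, 3600.0
e_tiny = mission_energy(frames=frames, mission_s=mission_s, **tiny)
e_mobile = mission_energy(frames=frames, mission_s=mission_s, **mobile)
assert e_tiny < e_mobile
```

Under these placeholder numbers the ranking flips between the two views, which is the qualitative behavior the abstract describes: deployment suitability depends on the mission's duty cycle, not on per-inference efficiency alone.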
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 9