Real Time Stable Haptic Rendering Of 3D Deformable Streaming Surface

MMSys 2017 (modified: 02 Feb 2022)
Abstract: In recent years, much research has focused on haptic interaction with streaming data, such as RGB-D video and point cloud streams captured by commodity depth sensors. Most previous methods use only partial streaming data from depth sensors and investigate haptic rendering of rigid surfaces without complex physics simulation. However, many virtual reality and tele-immersive applications, such as medical training and art design, require the complete scene together with physics simulation. In this paper, we propose a stable haptic rendering method capable of interacting with a streaming deformable surface in real time. Our method applies KinectFusion to reconstruct the complete real-world object surface in real time, rather than working on an incomplete surface. During reconstruction, it simultaneously uses a hierarchical shape matching (HSM) method to simulate surface deformation under haptic-enabled interaction. We demonstrate how to combine fusion with physics-based deformation simulation, and propose a continuous collision detection method based on the Truncated Signed Distance Function (TSDF). Furthermore, we propose a fast TSDF warping method that propagates the deformation into the TSDF, and a proxy finding method that locates the proxy position. The proposed method simulates haptic-enabled deformation of the 3D fused surface, and thereby enables novel haptic interaction for virtual reality and 3D tele-immersive applications. Experimental results show that the proposed approach provides stable haptic rendering and fast simulation of a 3D deformable surface.
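The abstract does not give implementation details, but the general idea behind TSDF-based collision detection and proxy finding can be sketched as follows. This is an illustrative assumption, not the paper's actual algorithm: the TSDF volume is sampled trilinearly, a collision is flagged when the signed distance at the haptic interaction point (HIP) goes negative, and a proxy position is recovered by projecting the HIP along the TSDF gradient back onto the zero level set. All function names here (`sample_tsdf`, `find_proxy`) are hypothetical.

```python
import numpy as np

def sample_tsdf(tsdf, p):
    # Trilinearly interpolate the TSDF volume at continuous voxel coords p.
    i = np.floor(p).astype(int)
    f = p - i
    x, y, z = i
    c = tsdf[x:x + 2, y:y + 2, z:z + 2]          # 2x2x2 neighborhood
    c = c[0] * (1 - f[0]) + c[1] * f[0]          # interpolate along x
    c = c[0] * (1 - f[1]) + c[1] * f[1]          # interpolate along y
    return c[0] * (1 - f[2]) + c[1] * f[2]       # interpolate along z

def tsdf_gradient(tsdf, p, h=0.5):
    # Central differences approximate the surface normal direction.
    g = np.array([
        sample_tsdf(tsdf, p + np.array([h, 0, 0])) - sample_tsdf(tsdf, p - np.array([h, 0, 0])),
        sample_tsdf(tsdf, p + np.array([0, h, 0])) - sample_tsdf(tsdf, p - np.array([0, h, 0])),
        sample_tsdf(tsdf, p + np.array([0, 0, h])) - sample_tsdf(tsdf, p - np.array([0, 0, h])),
    ])
    n = np.linalg.norm(g)
    return g / n if n > 0 else g

def find_proxy(tsdf, hip, iters=5):
    # Project the HIP back to the zero level set of the TSDF by stepping
    # along the gradient by the (negative) sampled distance.
    p = hip.astype(float)
    for _ in range(iters):
        d = sample_tsdf(tsdf, p)
        if abs(d) < 1e-4:        # already on the surface
            break
        p = p - d * tsdf_gradient(tsdf, p)
    return p
```

For a simple sanity check, a TSDF of the plane z = 8 (signed distance z − 8) lets a penetrating point at z = 6 be projected back to z ≈ 8; the negative sample at the penetrating point is what would trigger force feedback in a proxy-based renderer.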