Abstract: Realistic garment simulation is critical for digital humans. However, noticeable penetrations still exist in garments predicted by current learning-based simulation techniques. To reduce such penetrations, we leverage the garment geometry and neural Signed Distance Fields (SDFs) for effective collision handling. The key idea of our method is to divide the garment into patches and model the local and global garment geometry through Intra- and Inter-Patch Correlations (IIPC), which can be effectively learned through the context-modeling ability of Transformers. The geometry information is then used to predict a per-vertex moving offset, according to which we move penetrating vertices along the SDF's gradient directions to resolve collisions. Our module can be coupled with learning-based backbones to effectively resolve penetrations while retaining real-time performance. Extensive experiments show that the proposed method significantly outperforms prior works.
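To make the collision-handling step in the abstract concrete, the sketch below illustrates the general idea of moving penetrating vertices along SDF gradient directions by a predicted per-vertex offset. It is a minimal, hypothetical PyTorch example, not the paper's actual implementation: the function name `resolve_penetrations`, the tensor layout, and the contact threshold `eps` are assumptions for illustration only, and the offsets are simply taken as given (in the paper they are predicted from IIPC features).

```python
import torch
import torch.nn.functional as F

def resolve_penetrations(vertices, sdf_values, sdf_gradients, offsets, eps=1e-3):
    """Hypothetical sketch of SDF-gradient-based collision resolution.

    vertices:      (V, 3) garment vertex positions
    sdf_values:    (V,)   signed distance to the body (negative = inside the body)
    sdf_gradients: (V, 3) SDF gradients evaluated at the vertex positions
    offsets:       (V,)   predicted per-vertex moving offsets (non-negative)
    """
    # Vertices inside the body (or closer than eps) are treated as penetrating.
    penetrating = (sdf_values < eps).float().unsqueeze(-1)      # (V, 1)

    # Unit directions pointing away from the body surface.
    directions = F.normalize(sdf_gradients, dim=-1)             # (V, 3)

    # Move only penetrating vertices along the gradient by their offset.
    corrected = vertices + penetrating * offsets.unsqueeze(-1) * directions
    return corrected
```

In this reading, non-penetrating vertices are left untouched, so the correction acts as a post-processing module that can be attached to any learning-based garment prediction backbone.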