OneFit: Unified Neural Garment Simulation using Function-based Representation and Learning

27 Sept 2024 (modified: 26 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Garment draping, Unsupervised learning, Neural simulation, Virtual try-on
Abstract: Digital garment modeling using self-supervised learning has evolved significantly in terms of the speed and visual quality of garment deformation simulations. Recent advances have incorporated size-awareness, which allows garments to drape realistically, stretching only to avoid collisions with the human body. This enables their deployment in virtual try-on systems, where the goal is to observe how a garment fits. However, a major shortcoming is that these methods learn mesh-specific models, requiring a distinct model to be trained for each mesh representation of a given garment. In this paper, we introduce a novel self-supervised garment simulation approach that learns garment deformations using only functions. First, our PolyFit module converts garment mesh patches into functions, yielding a compact yet detail-preserving representation. Then, OneFit learns the deformations of these patches by restricting the space of PolyFit function transformations conditioned on different body poses, in a physics-guided and intrinsic-geometry-aware manner. It extends not only to various mesh representations of a given garment but also to diverse representations of a garment type; hence, a model trained on a single garment can generalise across several garment types. Thanks to its compact representation, it is computationally superior to its counterparts in both training and inference, and it scales well to unseen garments. Thus, by training OneFit on a set of garments, a mesh-agnostic, garment-agnostic deformation model can be learnt, which can be fine-tuned (or post-processed) to accommodate unseen garment types.
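The abstract does not specify PolyFit's exact parameterisation, but the core idea of replacing a mesh patch by a compact function can be sketched with an ordinary least-squares polynomial height-field fit. This is a minimal illustration only: the function names, the quadratic degree, and the height-field form `z = f(x, y)` are assumptions for the sketch, not the authors' actual method.

```python
import numpy as np

def fit_patch_poly(points, degree=2):
    """Least-squares fit of a height-field polynomial z = f(x, y) to a
    small mesh patch (hypothetical stand-in for a PolyFit-style module).
    Returns a fixed-size coefficient vector in place of the patch vertices."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # Monomial basis up to `degree`, e.g. {1, y, y^2, x, xy, x^2} for degree 2.
    cols = [x**i * y**j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

def eval_patch_poly(coeffs, x, y, degree=2):
    """Evaluate the fitted patch function at new (x, y) locations."""
    cols = [x**i * y**j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.stack(cols, axis=1) @ coeffs

# Toy patch: 100 vertices sampled from a gently curved surface.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(100, 2))
z = 0.5 * xy[:, 0]**2 - 0.3 * xy[:, 0] * xy[:, 1] + 0.1 * xy[:, 1]
pts = np.column_stack([xy, z])

coeffs = fit_patch_poly(pts)  # 6 coefficients instead of 100 vertices
z_hat = eval_patch_poly(coeffs, xy[:, 0], xy[:, 1])
max_err = float(np.max(np.abs(z_hat - z)))
```

The compactness claim in the abstract corresponds here to the ratio between the coefficient count (6 for a quadratic) and the number of patch vertices; a learned model would then act on the coefficient vector rather than on the mesh itself.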
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11766