Keywords: test-time adaptation, virtual try-on, source-parameter-free adaptation
TL;DR: The first test-time adaptation method for virtual try-on that guides diffusion models with statistical losses to preserve garment details across domain shifts without retraining.
Abstract: The rapid growth of e-commerce has driven notable advancements in diffusion-based virtual try-on models. These models, however, suffer significant quality degradation when deployed on real-world data that differs from their source (training) distribution. To address this degradation under domain shift, we introduce a test-time adaptation framework that enhances try-on quality during diffusion denoising (inference time) without retraining the model or updating its original network parameters. Our method applies statistical distribution matching across complementary domains during the diffusion denoising process. Comprehensive evaluation across four state-of-the-art diffusion models (IDM-VTON, LaDI-VTON, Stable-VTON, TPD) and three datasets (VITON-HD, DressCode, DeepFashion) demonstrates notable improvements across multiple dataset-method combinations, with sharpness gains averaging 7.74% and a distortion reduction of 0.95%. Our approach addresses important practical challenges in commercial virtual try-on deployment, enabling quality improvements across diverse domain conditions while preserving the original model's capabilities.
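To illustrate the general idea of guiding denoising with a statistical loss while keeping the pretrained model frozen, here is a minimal sketch; it is not the paper's implementation. All names (`stat_matching_loss`, `guide_latent`, the stand-in denoiser, the toy reference features) are hypothetical placeholders chosen for illustration under the assumption that guidance is applied by nudging the intermediate latent with the gradient of a mean/std matching loss.

```python
# Minimal sketch (not the paper's code) of statistics-based test-time guidance:
# at a denoising step, channel-wise mean/std of predicted features are matched
# to a reference, and the gradient of that loss nudges the latent while the
# network parameters stay frozen. The denoiser below is a toy stand-in.
import torch
import torch.nn as nn


def stat_matching_loss(feat_pred: torch.Tensor, feat_ref: torch.Tensor) -> torch.Tensor:
    """Match first- and second-order statistics over spatial dims (B, C, H, W)."""
    mu_p, mu_r = feat_pred.mean(dim=(-2, -1)), feat_ref.mean(dim=(-2, -1))
    sd_p, sd_r = feat_pred.std(dim=(-2, -1)), feat_ref.std(dim=(-2, -1))
    return ((mu_p - mu_r) ** 2).mean() + ((sd_p - sd_r) ** 2).mean()


def guide_latent(x_t: torch.Tensor, denoiser: nn.Module, feat_ref: torch.Tensor,
                 scale: float = 0.1) -> torch.Tensor:
    """One guidance step: steer the latent x_t; denoiser weights are never updated."""
    x_t = x_t.detach().requires_grad_(True)
    feat_pred = denoiser(x_t)                 # forward pass through the frozen model
    loss = stat_matching_loss(feat_pred, feat_ref)
    (grad,) = torch.autograd.grad(loss, x_t)  # gradient w.r.t. the latent only
    return (x_t - scale * grad).detach()      # nudge latent toward matched statistics


if __name__ == "__main__":
    torch.manual_seed(0)
    denoiser = nn.Conv2d(4, 4, 3, padding=1)  # toy stand-in for a diffusion U-Net
    for p in denoiser.parameters():
        p.requires_grad_(False)               # emulate a frozen pretrained model
    x_t = torch.randn(1, 4, 8, 8)             # toy latent at some denoising step
    feat_ref = torch.randn(1, 4, 8, 8)        # toy reference (e.g., garment) statistics source
    x_t = guide_latent(x_t, denoiser, feat_ref)
    print(x_t.shape)
```

In practice such a step would be interleaved with the scheduler's usual update at each (or selected) denoising timesteps, so adaptation happens entirely at inference time.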
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 7395