OmniTry: Virtual Try-On Anything without Masks

August 19, 2025
Authors: Yutong Feng, Linlin Zhang, Hengyuan Cao, Yiming Chen, Xiaoduan Feng, Jian Cao, Yuxiong Wu, Bin Wang
cs.AI

Abstract

Virtual Try-On (VTON) is a practical and widely applied task, for which most existing work focuses on clothes. This paper presents OmniTry, a unified framework that extends VTON beyond garments to any wearable object, e.g., jewelry and accessories, in a mask-free setting for more practical application. When extending to various types of objects, data curation is challenging: paired images, i.e., an object image and its corresponding try-on result, are hard to obtain. To tackle this problem, we propose a two-stage pipeline. In the first stage, we leverage large-scale unpaired images, i.e., portraits with arbitrary wearable items, to train the model for mask-free localization; specifically, we repurpose an inpainting model to automatically draw objects at suitable positions given an empty mask. In the second stage, the model is further fine-tuned with paired images to preserve the consistency of object appearance. We observe that the model from the first stage converges quickly even with few paired samples. OmniTry is evaluated on a comprehensive benchmark covering 12 common classes of wearable objects, with both in-shop and in-the-wild images. Experimental results show that OmniTry outperforms existing methods in both object localization and ID preservation. The code, model weights, and evaluation benchmark of OmniTry will be made publicly available at https://omnitry.github.io/.
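
To make the two-stage recipe concrete, here is a minimal, runnable PyTorch sketch. Everything in it is a hypothetical stand-in: TinyInpaintNet, the random tensors, and the item-erasure preprocessing are placeholders, since the abstract specifies neither the backbone nor the data pipeline. Only the overall flow, unpaired training with an all-zero mask followed by fine-tuning on a few paired samples, follows the description above.

```python
# Hypothetical sketch of the two-stage pipeline; not the released OmniTry code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyInpaintNet(nn.Module):
    """Placeholder inpainting model: RGB image + 1-channel mask + RGB object
    reference (7 input channels total) -> RGB prediction."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(7, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, image, mask, object_ref):
        return self.net(torch.cat([image, mask, object_ref], dim=1))

def train_step(model, opt, image_in, mask, object_ref, target):
    loss = F.mse_loss(model(image_in, mask, object_ref), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

model = TinyInpaintNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Stage 1: large-scale UNPAIRED portraits (person already wearing an item).
# The mask is all zeros, so the model gets no region hint and must learn to
# place the object at a plausible position by itself.
for _ in range(5):
    portrait = torch.rand(2, 3, 64, 64)     # portrait with the item worn
    person_only = torch.rand(2, 3, 64, 64)  # same portrait, item erased (assumed preprocessing)
    object_crop = torch.rand(2, 3, 64, 64)  # item crop taken from the portrait itself
    empty_mask = torch.zeros(2, 1, 64, 64)
    train_step(model, opt, person_only, empty_mask, object_crop, portrait)

# Stage 2: a few PAIRED samples (in-shop object image -> real try-on photo),
# fine-tuning the stage-1 model for object appearance/ID consistency.
for _ in range(5):
    shop_object = torch.rand(2, 3, 64, 64)  # product image of the object
    person = torch.rand(2, 3, 64, 64)       # person without the object
    tryon_gt = torch.rand(2, 3, 64, 64)     # ground-truth try-on result
    empty_mask = torch.zeros(2, 1, 64, 64)
    train_step(model, opt, person, empty_mask, shop_object, tryon_gt)
```

In the real system, the object reference would presumably be injected through the generative backbone's conditioning mechanism rather than by naive channel concatenation, and the loss would be a diffusion objective rather than pixel-wise MSE; the sketch only illustrates how the empty mask removes the localization hint in stage 1.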