

OmniTry: Virtual Try-On Anything without Masks

August 19, 2025
Authors: Yutong Feng, Linlin Zhang, Hengyuan Cao, Yiming Chen, Xiaoduan Feng, Jian Cao, Yuxiong Wu, Bin Wang
cs.AI

Abstract

Virtual Try-On (VTON) is a practical and widely applied task, for which most existing works focus on clothes. This paper presents OmniTry, a unified framework that extends VTON beyond garments to encompass any wearable object, e.g., jewelry and accessories, with a mask-free setting for more practical application. When extending to various types of objects, data curation is challenging for obtaining paired images, i.e., the object image and the corresponding try-on result. To tackle this problem, we propose a two-stage pipeline: In the first stage, we leverage large-scale unpaired images, i.e., portraits with any wearable items, to train the model for mask-free localization. Specifically, we repurpose the inpainting model to automatically draw objects in suitable positions given an empty mask. In the second stage, the model is further fine-tuned with paired images to transfer the consistency of object appearance. We observe that, after the first stage, the model converges quickly even with few paired samples. OmniTry is evaluated on a comprehensive benchmark consisting of 12 common classes of wearable objects, with both in-shop and in-the-wild images. Experimental results suggest that OmniTry outperforms existing methods on both object localization and ID-preservation. The code, model weights, and evaluation benchmark of OmniTry will be made publicly available at https://omnitry.github.io/.
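The two-stage pipeline described above can be sketched as a training schedule. This is a minimal illustrative sketch only: `TryOnModel`, `stage1_mask_free_localization`, and `stage2_appearance_transfer` are hypothetical placeholders invented here (the abstract does not specify the architecture or training code); the structure merely mirrors the described order of stages.

```python
from dataclasses import dataclass

@dataclass
class TryOnModel:
    """Toy stand-in for the repurposed inpainting model (hypothetical)."""
    stage: str = "init"
    steps: int = 0

    def train_step(self, batch):
        # Placeholder for one optimization step.
        self.steps += 1

def stage1_mask_free_localization(model, unpaired_portraits):
    # Stage 1: large-scale unpaired portraits, trained with an *empty*
    # mask so the model learns where to place a wearable object.
    model.stage = "localization"
    for batch in unpaired_portraits:
        model.train_step(batch)
    return model

def stage2_appearance_transfer(model, paired_samples):
    # Stage 2: fine-tune on few paired (object image, try-on result)
    # samples to transfer object-appearance consistency; per the
    # abstract, convergence here is fast after stage 1.
    model.stage = "appearance"
    for batch in paired_samples:
        model.train_step(batch)
    return model

model = TryOnModel()
model = stage1_mask_free_localization(model, [{"portrait": i} for i in range(100)])
model = stage2_appearance_transfer(model, [{"pair": i} for i in range(10)])
print(model.stage, model.steps)  # → appearance 110
```

The key design point reflected here is the data asymmetry: stage 1 consumes abundant unpaired portraits (localization only), while stage 2 needs only a small paired set (appearance fidelity).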