

ReSWD: ReSTIR'd, not shaken. Combining Reservoir Sampling and Sliced Wasserstein Distance for Variance Reduction

October 1, 2025
Authors: Mark Boss, Andreas Engelhardt, Simon Donné, Varun Jampani
cs.AI

Abstract

Distribution matching is central to many vision and graphics tasks, where the widely used Wasserstein distance is too costly to compute for high dimensional distributions. The Sliced Wasserstein Distance (SWD) offers a scalable alternative, yet its Monte Carlo estimator suffers from high variance, resulting in noisy gradients and slow convergence. We introduce Reservoir SWD (ReSWD), which integrates Weighted Reservoir Sampling into SWD to adaptively retain informative projection directions in optimization steps, resulting in stable gradients while remaining unbiased. Experiments on synthetic benchmarks and real-world tasks such as color correction and diffusion guidance show that ReSWD consistently outperforms standard SWD and other variance reduction baselines. Project page: https://reservoirswd.github.io/
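As a rough illustration of the idea described in the abstract, the sketch below estimates the Sliced Wasserstein Distance from random 1-D projections and uses weighted reservoir sampling (the classic A-Res scheme) to carry the most informative projection directions across optimization steps. The function names, reservoir size, and contribution-weighted keys are assumptions for illustration only; the paper's actual estimator includes a reweighting that keeps it unbiased, which is omitted here.

```python
# Minimal, illustrative sketch (NumPy): SWD from random projections, with a
# weighted reservoir that retains high-contribution directions between steps.
# Names and parameters are hypothetical, not the paper's implementation.
import numpy as np

def sliced_w2_per_direction(x, y, directions):
    """1-D squared Wasserstein-2 distance of x and y projected on each direction."""
    px = np.sort(x @ directions.T, axis=0)   # (n, k) sorted projections of x
    py = np.sort(y @ directions.T, axis=0)   # (n, k) sorted projections of y
    return np.mean((px - py) ** 2, axis=0)   # (k,) per-direction contribution

def reswd_step(x, y, reservoir, n_new=32, reservoir_size=16, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    d = x.shape[1]
    # Fresh uniform directions on the unit sphere plus the retained reservoir.
    fresh = rng.normal(size=(n_new, d))
    fresh /= np.linalg.norm(fresh, axis=1, keepdims=True)
    dirs = np.concatenate([fresh, reservoir], axis=0) if len(reservoir) else fresh

    contrib = sliced_w2_per_direction(x, y, dirs)
    swd_estimate = contrib.mean()

    # Weighted reservoir sampling (A-Res): keep directions with the largest
    # key u**(1/w), where w is the direction's contribution to the distance.
    u = rng.uniform(size=len(dirs))
    keys = u ** (1.0 / np.maximum(contrib, 1e-12))
    keep = np.argsort(keys)[-reservoir_size:]
    return swd_estimate, dirs[keep]

# Usage: x, y are equally sized point clouds; the reservoir carries
# informative directions from one optimization step to the next.
x = np.random.randn(256, 3)
y = np.random.randn(256, 3) + 0.5
reservoir = np.empty((0, 3))
for step in range(5):
    est, reservoir = reswd_step(x, y, reservoir)
    print(f"step {step}: SWD^2 estimate ~ {est:.4f}")
```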