Restart Sampling for Improving Generative Processes
June 26, 2023
Authors: Yilun Xu, Mingyang Deng, Xiang Cheng, Yonglong Tian, Ziming Liu, Tommi Jaakkola
cs.AI
Abstract
Generative processes that involve solving differential equations, such as
diffusion models, frequently necessitate balancing speed and quality. ODE-based
samplers are fast but plateau in performance while SDE-based samplers deliver
higher sample quality at the cost of increased sampling time. We attribute this
difference to sampling errors: ODE-samplers involve smaller discretization
errors while stochasticity in SDE contracts accumulated errors. Based on these
findings, we propose a novel sampling algorithm called Restart in order to
better balance discretization errors and contraction. The sampling method
alternates between adding substantial noise in additional forward steps and
strictly following a backward ODE. Empirically, the Restart sampler surpasses
previous SDE and ODE samplers in both speed and accuracy. Restart not only
outperforms the previous best SDE results, but also accelerates sampling
by 10-fold on CIFAR-10 and 2-fold on ImageNet 64×64. In addition,
it attains significantly better sample quality than ODE samplers within
comparable sampling times. Moreover, Restart better balances text-image
alignment/visual quality versus diversity than previous samplers in the
large-scale text-to-image Stable Diffusion model pre-trained on LAION 512×512.
Code is available at https://github.com/Newbeeer/diffusion_restart_sampling.
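The alternation the abstract describes, a forward re-noising jump followed by a deterministic backward ODE pass over a restart interval [t_min, t_max], can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the Heun integrator, the VE noise schedule sigma(t) = t, and the `score_fn` interface are assumptions chosen for a self-contained example.

```python
import numpy as np

def heun_ode_step(x, t_cur, t_next, score_fn):
    # Probability-flow ODE dx/dt = -t * score(x, t) under the VE schedule
    # sigma(t) = t, integrated with Heun's second-order method.
    d_cur = -t_cur * score_fn(x, t_cur)
    x_next = x + (t_next - t_cur) * d_cur
    if t_next > 0:  # second-order correction, skipped at t = 0
        d_next = -t_next * score_fn(x_next, t_next)
        x_next = x + 0.5 * (t_next - t_cur) * (d_cur + d_next)
    return x_next

def ode_solve(x, ts, score_fn):
    # Deterministic backward pass along a decreasing noise schedule ts.
    for t_cur, t_next in zip(ts[:-1], ts[1:]):
        x = heun_ode_step(x, t_cur, t_next, score_fn)
    return x

def restart_sampler(x, ts, score_fn, t_min, t_max, k, rng):
    # Sketch of Restart: run the backward ODE to t_min, then k times add a
    # large forward noise jump (t_min -> t_max) and re-run the backward ODE
    # over [t_max, t_min], before finishing the ODE down to t = 0.
    ts = np.asarray(ts, dtype=float)
    eps = 1e-9  # tolerance for schedule boundary comparisons
    x = ode_solve(x, ts[ts >= t_min - eps], score_fn)
    restart_ts = ts[(ts >= t_min - eps) & (ts <= t_max + eps)]
    for _ in range(k):
        # forward jump: raise the marginal variance from t_min^2 to t_max^2
        x = x + np.sqrt(t_max**2 - t_min**2) * rng.standard_normal(x.shape)
        x = ode_solve(x, restart_ts, score_fn)
    return ode_solve(x, ts[ts <= t_min + eps], score_fn)
```

As a toy check, with data drawn from N(0, 1) the VE marginal at noise level t is N(0, 1 + t^2), whose score is -x / (1 + t^2); feeding that closed-form score to `restart_sampler` should map heavily noised inputs back to roughly unit-variance samples.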