

GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver

October 20, 2025
作者: Aleksandr Oganov, Ilya Bykov, Eva Neudachina, Mishan Aliev, Alexander Tolmachev, Alexander Sidorov, Aleksandr Zuev, Andrey Okhotin, Denis Rakitin, Aibek Alanov
cs.AI

Abstract

While diffusion models achieve state-of-the-art generation quality, they still suffer from computationally expensive sampling. Recent works address this issue with gradient-based optimization methods that distill a few-step ODE diffusion solver from the full sampling process, reducing the number of function evaluations from dozens to just a few. However, these approaches often rely on intricate training techniques and do not explicitly focus on preserving fine-grained details. In this paper, we introduce the Generalized Solver: a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity. We call the resulting method the Generalized Adversarial Solver and demonstrate its superior performance compared to existing solver training methods under similar resource constraints. Code is available at https://github.com/3145tttt/GAS.
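The abstract describes two ingredients: a few-step ODE sampler whose step coefficients are themselves free parameters, and a training objective combining a distillation loss against the full-trajectory teacher with an adversarial term. The paper's actual parameterization and losses live in the linked repository; as a minimal, hedged illustration of the idea, here is a toy linear-multistep-style solver with learnable coefficients and a combined loss, using a stand-in drift field and a placeholder discriminator score (all names here are illustrative, not the paper's API):

```python
import numpy as np

def toy_velocity(x, t):
    # Stand-in for the diffusion model's ODE drift; a simple linear field.
    return -x * (1.0 + 0.1 * t)

def generalized_solver(x0, timesteps, coeffs):
    # coeffs[i][j]: weight of the j-th most recent stored velocity at step i,
    # a linear-multistep-style parameterization. With coeffs[i] = [1, 0, ...]
    # this reduces to the plain Euler solver.
    x = x0
    history = []
    for i in range(len(timesteps) - 1):
        dt = timesteps[i + 1] - timesteps[i]
        history.append(toy_velocity(x, timesteps[i]))
        # zip truncates to the number of velocities seen so far.
        drift = sum(c * v for c, v in zip(coeffs[i], reversed(history)))
        x = x + dt * drift
    return x

def combined_loss(x_student, x_teacher, disc_score, adv_weight=0.1):
    # Distillation term pulls the few-step output toward the teacher's
    # full-trajectory endpoint; the adversarial term (placeholder
    # discriminator probability here) penalizes detectable artifacts.
    distill = np.mean((x_student - x_teacher) ** 2)
    adv = -np.log(disc_score + 1e-8)
    return distill + adv_weight * adv
```

In this sketch, training would optimize `coeffs` (and, adversarially, the discriminator) so that a 4-step trajectory matches a many-step teacher trajectory; the real method operates on a pretrained diffusion model rather than a toy drift.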