
HiGS: History-Guided Sampling for Plug-and-Play Enhancement of Diffusion Models

September 26, 2025
Authors: Seyedmorteza Sadat, Farnood Salehi, Romann M. Weber
cs.AI

Abstract

While diffusion models have made remarkable progress in image generation, their outputs can still appear unrealistic and lack fine details, especially when using fewer neural function evaluations (NFEs) or lower guidance scales. To address this issue, we propose a novel momentum-based sampling technique, termed history-guided sampling (HiGS), which enhances the quality and efficiency of diffusion sampling by integrating recent model predictions into each inference step. Specifically, HiGS leverages the difference between the current prediction and a weighted average of past predictions to steer the sampling process toward more realistic outputs with better details and structure. Our approach introduces practically no additional computation and integrates seamlessly into existing diffusion frameworks, requiring neither extra training nor fine-tuning. Extensive experiments show that HiGS consistently improves image quality across diverse models and architectures and under varying sampling budgets and guidance scales. Moreover, using a pretrained SiT model, HiGS achieves a new state-of-the-art FID of 1.61 for unguided ImageNet generation at 256×256 with only 30 sampling steps (instead of the standard 250). We thus present HiGS as a plug-and-play enhancement to standard diffusion sampling that enables faster generation with higher fidelity.
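The abstract's description suggests a guidance rule of the form "current prediction plus a weighted difference between it and a running average of past predictions." The sketch below illustrates that idea in a toy Euler-style sampler; it is a minimal illustration, not the paper's exact algorithm, and the names (`higs_prediction`, `sample_with_higs`, `guidance_weight`, `momentum`) and the denoiser interface are assumptions introduced here.

```python
import numpy as np

def higs_prediction(model_pred, history_avg, guidance_weight):
    """Steer the current prediction using its difference from a weighted
    average of past predictions (illustrative form, not the paper's exact rule)."""
    return model_pred + guidance_weight * (model_pred - history_avg)

def sample_with_higs(denoiser, x, timesteps, guidance_weight=0.5, momentum=0.9):
    """Toy sampler with history-guided predictions.

    `denoiser(x, t)` stands in for a pretrained model's clean-image (x0)
    prediction at noise level t; `timesteps` decreases from 1.0 toward 0.0.
    """
    history_avg = None
    for t_cur, t_next in zip(timesteps[:-1], timesteps[1:]):
        pred = denoiser(x, t_cur)
        if history_avg is None:
            history_avg = pred  # no history yet on the first step
        guided = higs_prediction(pred, history_avg, guidance_weight)
        # maintain an exponential moving average of past predictions
        history_avg = momentum * history_avg + (1.0 - momentum) * pred
        # simple deterministic update toward the guided x0 estimate
        x = guided + (t_next / t_cur) * (x - guided)
    return x
```

Because the extra state is only a running average of predictions the model already computes, the history term adds essentially no computation per step, which matches the abstract's claim that the method is plug-and-play.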