Flowing Backwards: Improving Normalizing Flows via Reverse Representation Alignment

November 27, 2025
Authors: Yang Chen, Xiaowei Xu, Shuai Wang, Chenhui Zhu, Ruxue Wen, Xubin Li, Tiezheng Ge, Limin Wang
cs.AI

Abstract

Normalizing Flows (NFs) are a class of generative models distinguished by a mathematically invertible architecture, where the forward pass transforms data into a latent space for density estimation, and the reverse pass generates new samples from this space. This characteristic creates an intrinsic synergy between representation learning and data generation. However, the generative quality of standard NFs is limited by the poor semantic representations learned through log-likelihood optimization. To remedy this, we propose a novel alignment strategy that creatively leverages the invertibility of NFs: instead of regularizing the forward pass, we align the intermediate features of the generative (reverse) pass with representations from a powerful vision foundation model, demonstrating superior effectiveness over naive alignment. We also introduce a novel training-free, test-time optimization algorithm for classification, which provides a more intrinsic evaluation of the NF's embedded semantic knowledge. Comprehensive experiments demonstrate that our approach accelerates the training of NFs by over 3.3×, while simultaneously delivering significant improvements in both generative quality and classification accuracy. New state-of-the-art results for NFs are established on ImageNet 64×64 and 256×256. Our code is available at https://github.com/MCG-NJU/FlowBack.
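For intuition, the sketch below illustrates one way the reverse-pass alignment described above could be written: intermediate features from the NF's generative (reverse) pass are projected and pulled toward a frozen vision foundation model's features with a cosine-similarity objective, instead of regularizing the forward pass. This is a hedged approximation rather than the authors' implementation; the interfaces `nf_model.reverse(..., return_features=True)`, the `projector` module, the `align_layer` index, and the use of DINOv2-style patch features are all assumptions made for illustration.

```python
# Minimal sketch (not the released FlowBack code) of reverse-pass representation alignment.
# Assumed interfaces: nf_model.forward(images) -> latents, and
# nf_model.reverse(z, return_features=True) -> list of intermediate feature maps.
import torch
import torch.nn.functional as F

def reverse_alignment_loss(nf_model, foundation_encoder, projector, images, align_layer):
    """Hypothetical alignment loss applied to the generative (reverse) pass.

    nf_model           -- invertible flow (forward: data -> latents, reverse: latents -> data)
    foundation_encoder -- frozen vision foundation model providing target features
    projector          -- small MLP mapping NF features to the target feature dimension
    align_layer        -- index of the reverse-pass block whose features are aligned
    """
    with torch.no_grad():
        target = foundation_encoder(images)      # (B, N, D_target), frozen alignment targets

    # Encode images to latents, then run the reverse pass and collect intermediate features.
    z = nf_model.forward(images)
    feats = nf_model.reverse(z, return_features=True)[align_layer]  # (B, N, D_nf)

    pred = projector(feats)                      # (B, N, D_target)
    # Negative cosine similarity between projected NF features and foundation features.
    return 1.0 - F.cosine_similarity(pred, target, dim=-1).mean()
```

In training, a term of this kind would presumably be added to the standard negative log-likelihood objective with a weighting coefficient, so that density estimation and reverse-pass alignment are optimized jointly.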