Parallel Loop Transformer for Efficient Test-Time Computation Scaling

October 28, 2025
Authors: Bohong Wu, Mengzhao Chen, Xiang Luo, Shen Yan, Qifan Yu, Fan Xia, Tianqi Zhang, Hongrui Zhan, Zheng Zhong, Xun Zhou, Siyuan Qiao, Xingyan Bin
cs.AI

Abstract

Large Language Models (LLMs) are powerful but often too slow and costly for real-world use during inference. Looped transformers save on parameters by reusing the same weights for multiple computational steps, or "loops." However, this approach has a major flaw: the loops run one after another, so inference latency and memory requirements grow with each added loop, making them impractical for fast applications. To solve this problem, we introduce the Parallel Loop Transformer (PLT), a new architecture that delivers the performance benefits of a deep, looped model with the low latency of a standard, non-looped model. PLT relies on two key techniques. First, Cross-Loop Parallelism (CLP) breaks the sequential dependency by computing different loops for different tokens at the same time, all within a single forward pass. Second, to keep memory costs from growing, an Efficient Representation Enhancement strategy shares the KV cache from the first loop with all subsequent loops and uses Gated Sliding-Window Attention (G-SWA) to combine this shared global information with local information, maintaining high accuracy. Our experiments show that PLT achieves the accuracy of a traditional looped model with almost no extra latency or memory cost compared to a standard transformer.
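To make the G-SWA idea concrete, the following is a minimal PyTorch sketch based only on the description above. The class name, the window size, the per-head scalar gate, the projection layout, and the shared_k/shared_v argument names are illustrative assumptions, not the paper's implementation; the only elements taken from the abstract are the shared first-loop KV cache (global path), sliding-window attention over the current loop (local path), and a gate that blends the two.

import torch
import torch.nn.functional as F
from torch import nn

class GatedSlidingWindowAttention(nn.Module):
    """Hypothetical sketch of G-SWA: later loops attend globally to the
    first loop's shared KV cache and locally (sliding window) to their
    own keys/values, with the two paths blended by a learned gate."""

    def __init__(self, d_model: int, n_heads: int, window: int = 128):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.window = n_heads, window
        self.head_dim = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)
        self.kv_proj = nn.Linear(d_model, 2 * d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        # One gate logit per head (this granularity is an assumption);
        # sigmoid(0) = 0.5 starts as an even blend of both paths.
        self.gate = nn.Parameter(torch.zeros(n_heads, 1, 1))

    def _split(self, x: torch.Tensor) -> torch.Tensor:
        # (B, T, D) -> (B, H, T, D/H)
        b, t, _ = x.shape
        return x.view(b, t, self.n_heads, self.head_dim).transpose(1, 2)

    def forward(self, x, shared_k, shared_v):
        # x: current-loop hidden states, (B, T, D).
        # shared_k / shared_v: projected keys/values cached by the first
        # loop, (B, T, D); reused here instead of storing a new KV cache.
        b, t, _ = x.shape
        q = self._split(self.q_proj(x))
        k_loc, v_loc = self.kv_proj(x).chunk(2, dim=-1)

        idx = torch.arange(t, device=x.device)
        causal = idx[:, None] >= idx[None, :]  # full causal mask
        local = causal & (idx[:, None] - idx[None, :] < self.window)

        # Global path: causal attention over the shared first-loop cache.
        glob = F.scaled_dot_product_attention(
            q, self._split(shared_k), self._split(shared_v), attn_mask=causal)
        # Local path: attention restricted to a causal sliding window.
        loc = F.scaled_dot_product_attention(
            q, self._split(k_loc), self._split(v_loc), attn_mask=local)

        # Learned gate blends shared global context with fresh local context.
        g = torch.sigmoid(self.gate)
        out = g * glob + (1.0 - g) * loc
        return self.out_proj(out.transpose(1, 2).reshape(b, t, -1))

In this sketch the first loop would run ordinary attention and expose its projected keys and values as shared_k and shared_v; every later loop reuses that cache rather than writing its own, which is what would keep the KV-cache footprint roughly constant as loops are added.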