

Artificial Hippocampus Networks for Efficient Long-Context Modeling

October 8, 2025
作者: Yunhao Fang, Weihao Yu, Shu Zhong, Qinghao Ye, Xuehan Xiong, Lai Wei
cs.AI

Abstract

Long-sequence modeling faces a fundamental trade-off between the efficiency of compressive fixed-size memory in RNN-like models and the fidelity of lossless growing memory in attention-based Transformers. Inspired by the Multi-Store Model in cognitive science, we introduce a memory framework of artificial neural networks. Our method maintains a sliding window of the Transformer's KV cache as lossless short-term memory, while a learnable module termed Artificial Hippocampus Network (AHN) recurrently compresses out-of-window information into a fixed-size compact long-term memory. To validate this framework, we instantiate AHNs using modern RNN-like architectures, including Mamba2, DeltaNet, and Gated DeltaNet. Extensive experiments on long-context benchmarks LV-Eval and InfiniteBench demonstrate that AHN-augmented models consistently outperform sliding window baselines and achieve performance comparable or even superior to full-attention models, while substantially reducing computational and memory requirements. For instance, augmenting the Qwen2.5-3B-Instruct with AHNs reduces inference FLOPs by 40.5% and memory cache by 74.0%, while improving its average score on LV-Eval (128k sequence length) from 4.41 to 5.88. Code is available at: https://github.com/ByteDance-Seed/AHN.
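The following is a minimal, self-contained sketch of the memory flow the abstract describes: exact key/value pairs for recent tokens are kept as lossless short-term memory, and pairs evicted from the sliding window are folded into a fixed-size state by a learnable recurrent module standing in for the AHN. This is an illustrative assumption, not the authors' implementation (which instantiates AHNs with Mamba2, DeltaNet, or Gated DeltaNet; see the linked repository); all names here, such as AHNSketch, SlidingWindowWithAHN, and append, are hypothetical.

```python
# Conceptual sketch only; not the code from ByteDance-Seed/AHN.
import torch
import torch.nn as nn


class AHNSketch(nn.Module):
    """Toy gated recurrence over evicted key/value pairs.

    A simple gated update is used here purely to illustrate fixed-size
    compression; the paper uses modern RNN-like architectures instead.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(3 * d_model, d_model)
        self.cand = nn.Linear(2 * d_model, d_model)

    def forward(self, state: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        # state, k, v: (batch, d_model). The output has the same shape as `state`,
        # so long-term memory stays constant-size however many tokens are folded in.
        g = torch.sigmoid(self.gate(torch.cat([state, k, v], dim=-1)))
        c = torch.tanh(self.cand(torch.cat([k, v], dim=-1)))
        return (1.0 - g) * state + g * c


class SlidingWindowWithAHN:
    """Lossless sliding-window KV cache plus a compact recurrent long-term memory."""

    def __init__(self, window: int, d_model: int):
        self.window = window
        self.keys = []                         # short-term memory: exact keys
        self.values = []                       # short-term memory: exact values
        self.ahn = AHNSketch(d_model)
        self.state = torch.zeros(1, d_model)   # long-term memory: fixed size

    def append(self, k: torch.Tensor, v: torch.Tensor) -> None:
        self.keys.append(k)
        self.values.append(v)
        if len(self.keys) > self.window:
            # The oldest pair leaves the lossless window and is compressed
            # into the fixed-size state instead of being discarded.
            old_k, old_v = self.keys.pop(0), self.values.pop(0)
            with torch.no_grad():              # demo only; no gradients needed here
                self.state = self.ahn(self.state, old_k, old_v)


if __name__ == "__main__":
    d_model, window = 64, 4
    cache = SlidingWindowWithAHN(window=window, d_model=d_model)
    for _ in range(16):
        cache.append(torch.randn(1, d_model), torch.randn(1, d_model))
    # The window stays bounded and the compressed state keeps a fixed shape.
    print(len(cache.keys), cache.state.shape)  # 4 torch.Size([1, 64])
```

Under these assumptions, per-token memory is bounded by the window size plus one constant-size state, which is the property the abstract credits for the reported reductions in cache size and inference FLOPs.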