

Frac-Connections: Fractional Extension of Hyper-Connections

March 18, 2025
Authors: Defa Zhu, Hongzhi Huang, Jundong Zhou, Zihao Huang, Yutao Zeng, Banggu Wu, Qiyang Min, Xun Zhou
cs.AI

Abstract

Residual connections are central to modern deep learning architectures, enabling the training of very deep networks by mitigating the vanishing-gradient problem. Hyper-Connections recently generalized residual connections by introducing multiple connection strengths at different depths, thereby addressing the seesaw effect between vanishing gradients and representation collapse. However, Hyper-Connections increase memory access costs by expanding the width of the hidden states. In this paper, we propose Frac-Connections, a novel approach that divides hidden states into multiple parts rather than expanding their width. Frac-Connections retain some of the benefits of Hyper-Connections while reducing memory consumption. To validate their effectiveness, we conduct large-scale experiments on language tasks, the largest being a 7B MoE model trained on up to 3T tokens, which demonstrate that Frac-Connections significantly outperform residual connections.
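
The core mechanism lends itself to a short sketch. Below is a minimal, hypothetical PyTorch reading of a Frac-Connection layer, reconstructed from the abstract rather than from the paper's exact parameterization: the width-d residual stream is split into n fractions that play the role of Hyper-Connections' n parallel streams, a learnable n×n matrix remixes the fractions before the wrapped block (the analogue of width connections), and per-fraction strengths scale how the block output is written back (the analogue of depth connections). The class name `FracConnection` and the parameters `mix`, `depth`, and `n_fracs` are illustrative names, not taken from the paper.

```python
import torch
import torch.nn as nn


class FracConnection(nn.Module):
    """Hypothetical sketch of a Frac-Connection wrapper around a residual block.

    Instead of replicating the hidden state n times (Hyper-Connections),
    the width-d state is split into n fractions of width d/n, which are
    remixed before the block and updated fraction-wise after it.
    """

    def __init__(self, block: nn.Module, d_model: int, n_fracs: int = 4):
        super().__init__()
        assert d_model % n_fracs == 0, "d_model must split evenly into fractions"
        self.block = block
        self.n = n_fracs
        # Remixes the n fractions before the block; identity init so
        # training starts from a plain residual connection.
        self.mix = nn.Parameter(torch.eye(n_fracs))
        # Per-fraction strength for writing the block output back.
        self.depth = nn.Parameter(torch.ones(n_fracs))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        B, T, D = h.shape
        fracs = h.view(B, T, self.n, D // self.n)        # split width into n parts
        mixed = torch.einsum('ij,btjd->btid', self.mix, fracs)
        x = mixed.reshape(B, T, D)                       # block still sees width D
        y = self.block(x).view(B, T, self.n, D // self.n)
        out = fracs + self.depth.view(self.n, 1) * y     # per-fraction residual add
        return out.reshape(B, T, D)


if __name__ == "__main__":
    # Usage sketch with an arbitrary feed-forward block.
    ffn = nn.Sequential(nn.LayerNorm(512), nn.Linear(512, 2048),
                        nn.GELU(), nn.Linear(2048, 512))
    layer = FracConnection(ffn, d_model=512, n_fracs=4)
    h = torch.randn(2, 16, 512)
    assert layer(h).shape == h.shape
```

The design point this sketch illustrates: because the fractions are views of the original width-d state, the hidden state is never widened, so per-layer activation memory traffic stays at the level of a plain residual connection, whereas Hyper-Connections with expansion rate n move roughly n times as much hidden-state memory.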
