Revisiting Bi-Linear State Transitions in Recurrent Neural Networks
May 27, 2025
Authors: M. Reza Ebrahimi, Roland Memisevic
cs.AI
Abstract
The role of hidden units in recurrent neural networks is typically seen as
modeling memory, with research focusing on enhancing information retention
through gating mechanisms. A less explored perspective views hidden units as
active participants in the computation performed by the network, rather than
passive memory stores. In this work, we revisit bi-linear operations, which
involve multiplicative interactions between hidden units and input embeddings.
We demonstrate theoretically and empirically that they constitute a natural
inductive bias for representing the evolution of hidden states in state
tracking tasks. These are the simplest type of task that requires hidden units
to actively contribute to the behavior of the network. We also show that
bi-linear state updates form a natural hierarchy corresponding to state
tracking tasks of increasing complexity, with popular linear recurrent networks
such as Mamba residing at the lowest-complexity center of that hierarchy.
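To make the idea concrete, the bi-linear update described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact architecture: it assumes a third-order weight tensor `T` so that each input embedding defines a state-transition matrix applied multiplicatively to the previous hidden state.

```python
import numpy as np

# Bi-linear recurrent state update (illustrative sketch, assumed
# parameterization): the next hidden state is a multiplicative
# interaction between the input embedding x_t and the previous
# hidden state h_{t-1}:
#
#   h_t[i] = sum_{j,k} T[i, j, k] * x_t[j] * h_{t-1}[k]
#
# Equivalently, each input induces a transition matrix A(x_t) = T . x_t,
# so h_t = A(x_t) @ h_{t-1} -- the hidden units participate in the
# computation rather than merely storing information.

rng = np.random.default_rng(0)
d_in, d_h = 4, 8                                   # embedding and hidden sizes
T = rng.normal(scale=0.1, size=(d_h, d_in, d_h))   # bi-linear weight tensor

def bilinear_step(h_prev, x):
    """One recurrence step: h_t = (T . x) @ h_prev."""
    A = np.einsum('ijk,j->ik', T, x)  # input-dependent transition matrix
    return A @ h_prev

h = rng.normal(size=d_h)
for _ in range(5):                    # unroll a short input sequence
    x = rng.normal(size=d_in)
    h = bilinear_step(h, x)
print(h.shape)                        # hidden state keeps shape (d_h,)
```

Restricting `A(x_t)` to a diagonal (element-wise) matrix recovers the linear-recurrence structure of models such as Mamba, which is one way to see why they sit at the lowest-complexity level of the hierarchy the paper describes.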