

Revisiting Bi-Linear State Transitions in Recurrent Neural Networks

May 27, 2025
Authors: M. Reza Ebrahimi, Roland Memisevic
cs.AI

Abstract

The role of hidden units in recurrent neural networks is typically seen as modeling memory, with research focusing on enhancing information retention through gating mechanisms. A less explored perspective views hidden units as active participants in the computation performed by the network, rather than passive memory stores. In this work, we revisit bi-linear operations, which involve multiplicative interactions between hidden units and input embeddings. We demonstrate theoretically and empirically that they constitute a natural inductive bias for representing the evolution of hidden states in state tracking tasks. These are the simplest type of task that requires hidden units to actively contribute to the behavior of the network. We also show that bi-linear state updates form a natural hierarchy corresponding to state tracking tasks of increasing complexity, with popular linear recurrent networks such as Mamba residing at the lowest-complexity center of that hierarchy.
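To make the "multiplicative interactions between hidden units and input embeddings" concrete, here is a minimal NumPy sketch of a generic bi-linear recurrent update. The three-way tensor `W`, the dimensions, and the function name are illustrative assumptions, not the paper's exact parameterization: the point is only that each input selects an input-dependent linear transition applied to the previous hidden state.

```python
import numpy as np

def bilinear_step(W, x_t, h_prev):
    """One bi-linear recurrent update (illustrative, not the paper's exact form):
    h_t[i] = sum_{j,k} W[i, j, k] * x_t[j] * h_prev[k].
    Equivalently, the input selects a transition matrix
    A(x_t) = sum_j x_t[j] * W[:, j, :], which is applied to h_prev,
    so hidden units interact multiplicatively with the input embedding."""
    A = np.tensordot(W, x_t, axes=([1], [0]))  # (hidden, hidden) transition
    return A @ h_prev

# Toy run with arbitrary dimensions, just to exercise the update.
rng = np.random.default_rng(0)
hidden, embed = 4, 3
W = rng.normal(size=(hidden, embed, hidden))
h = rng.normal(size=hidden)
for x in rng.normal(size=(5, embed)):  # a short input sequence
    h = bilinear_step(W, x, h)
```

Because the state evolves by repeated input-dependent matrix products, such an update can track compositions of transformations (e.g. group actions) in a way that a fixed, input-independent transition cannot; restricting the structure of `A(x_t)` (for instance to diagonal matrices, as in linear recurrences like Mamba) recovers the lower levels of the hierarchy the abstract describes.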