Deep Residual Echo State Networks: exploring residual orthogonal connections in untrained Recurrent Neural Networks
August 28, 2025
Authors: Matteo Pinna, Andrea Ceni, Claudio Gallicchio
cs.AI
Abstract
Echo State Networks (ESNs) are a particular type of untrained Recurrent
Neural Networks (RNNs) within the Reservoir Computing (RC) framework, popular
for their fast and efficient learning. However, traditional ESNs often struggle
with long-term information processing. In this paper, we introduce a novel
class of deep untrained RNNs based on temporal residual connections, called
Deep Residual Echo State Networks (DeepResESNs). We show that leveraging a
hierarchy of untrained residual recurrent layers significantly boosts memory
capacity and long-term temporal modeling. For the temporal residual
connections, we consider different orthogonal configurations, including
randomly generated and fixed-structure configurations, and we study their
effect on network dynamics. A thorough mathematical analysis outlines necessary
and sufficient conditions to ensure stable dynamics within DeepResESN. Our
experiments on a variety of time series tasks showcase the advantages of the
proposed approach over traditional shallow and deep RC.
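To make the idea concrete, below is a minimal NumPy sketch of a single untrained reservoir layer with a temporal residual connection through an orthogonal matrix. The exact state update, the `beta` scaling factor, and all hyperparameter names are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def random_orthogonal(n, rng):
    # QR decomposition of a random Gaussian matrix gives an orthogonal Q;
    # multiplying columns by the signs of R's diagonal removes sign ambiguity.
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

class ResidualESNLayer:
    """One untrained reservoir layer with an orthogonal temporal residual branch.

    Hypothetical update (an assumption for illustration):
        h_t = O @ h_{t-1} + beta * tanh(W @ h_{t-1} + V @ u_t)
    where O is orthogonal and W, V are random and left untrained.
    """

    def __init__(self, in_dim, units, spectral_radius=0.9, beta=0.5, rng=None):
        rng = rng or np.random.default_rng()
        W = rng.uniform(-1.0, 1.0, (units, units))
        # Rescale the recurrent weights to a target spectral radius,
        # the standard ESN recipe for controlling reservoir dynamics.
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W = W
        self.V = rng.uniform(-1.0, 1.0, (units, in_dim))
        self.O = random_orthogonal(units, rng)  # residual skip connection
        self.beta = beta
        self.h = np.zeros(units)

    def step(self, u):
        # Orthogonal residual term preserves the norm of the previous state,
        # which is the intuition behind improved long-term memory.
        self.h = self.O @ self.h + self.beta * np.tanh(
            self.W @ self.h + self.V @ u
        )
        return self.h
```

Stacking several such layers, each fed the states of the layer below, would give a deep hierarchy in the spirit of DeepResESN; only a linear readout on the collected states would be trained.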