Deep Residual Echo State Networks: exploring residual orthogonal connections in untrained Recurrent Neural Networks
August 28, 2025
Authors: Matteo Pinna, Andrea Ceni, Claudio Gallicchio
cs.AI
Abstract
Echo State Networks (ESNs) are a particular type of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) framework, popular for their fast and efficient learning. However, traditional ESNs often struggle with long-term information processing. In this paper, we introduce a novel class of deep untrained RNNs based on temporal residual connections, called Deep Residual Echo State Networks (DeepResESNs). We show that leveraging a hierarchy of untrained residual recurrent layers significantly boosts memory capacity and long-term temporal modeling. For the temporal residual connections, we consider different orthogonal configurations, including randomly generated and fixed-structure configurations, and we study their effect on network dynamics. A thorough mathematical analysis outlines necessary and sufficient conditions to ensure stable dynamics within DeepResESNs. Our experiments on a variety of time series tasks showcase the advantages of the proposed approach over traditional shallow and deep RC.
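To make the idea concrete, below is a minimal NumPy sketch of a single reservoir layer with a temporal residual connection, as described in the abstract. The exact update rule, the branch scalings `alpha` and `beta`, and all dimensions are illustrative assumptions, not the paper's definitive formulation; the orthogonal matrix here is randomly generated via QR, one of the two configuration families (random vs. fixed-structure) the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper).
n_in, n_res, T = 1, 100, 200

# Untrained input and recurrent weights; the recurrent matrix is
# rescaled to a spectral radius below 1, standard ESN practice.
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in)) * 0.5
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Orthogonal residual connection: a random orthogonal matrix via QR.
# A fixed-structure alternative would be, e.g., a permutation matrix.
O, _ = np.linalg.qr(rng.standard_normal((n_res, n_res)))

alpha, beta = 1.0, 0.1  # hypothetical residual/nonlinear branch scalings

def step(h, u):
    # Temporal residual update: an orthogonal skip of the previous
    # state plus a scaled nonlinear reservoir branch.
    return alpha * (O @ h) + beta * np.tanh(W @ h + W_in @ u)

# Drive the untrained reservoir with a toy input signal; only a
# linear readout on the collected states would be trained.
h = np.zeros(n_res)
states = []
for t in range(T):
    u = np.array([np.sin(0.1 * t)])
    h = step(h, u)
    states.append(h)

states = np.stack(states)  # (T, n_res) state trajectory for the readout
```

A deep variant would stack several such layers, feeding each layer's state sequence as input to the next, with only the final readout trained.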