FAN: Fourier Analysis Networks
October 3, 2024
Authors: Yihong Dong, Ge Li, Yongding Tao, Xue Jiang, Kechi Zhang, Jia Li, Jing Su, Jun Zhang, Jingjing Xu
cs.AI
Abstract
Despite the remarkable success achieved by neural networks, particularly
those represented by MLP and Transformer, we reveal that they exhibit potential
flaws in the modeling and reasoning of periodicity, i.e., they tend to memorize
the periodic data rather than genuinely understanding the underlying principles
of periodicity. However, periodicity is a crucial trait in various forms of
reasoning and generalization, underpinning predictability across natural and
engineered systems through recurring patterns in observations. In this paper,
we propose FAN, a novel network architecture based on Fourier Analysis, which
enables efficient modeling of and reasoning about periodic phenomena.
By introducing Fourier Series, the periodicity is naturally integrated into the
structure and computational processes of the neural network, thus achieving a
more accurate expression and prediction of periodic patterns. As a promising
substitute for the multi-layer perceptron (MLP), FAN can seamlessly replace MLP in
various models with fewer parameters and FLOPs. Through extensive experiments,
we demonstrate the effectiveness of FAN in modeling and reasoning about
periodic functions, and the superiority and generalizability of FAN across a
range of real-world tasks, including symbolic formula representation, time
series forecasting, and language modeling.
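The abstract states that Fourier Series are built directly into the network's structure and computation, and that FAN can replace MLP blocks with fewer parameters and FLOPs. Below is a minimal PyTorch sketch of one way such a layer could look, assuming a periodic branch (cos/sin of a learned projection) concatenated with a standard nonlinear branch; the class name FANLayer, the split ratio p_ratio, and the GELU activation are illustrative assumptions rather than the authors' reference implementation.

```python
import torch
import torch.nn as nn


class FANLayer(nn.Module):
    """Illustrative sketch of a Fourier-analysis-based layer (not the paper's reference code)."""

    def __init__(self, in_dim: int, out_dim: int, p_ratio: float = 0.25):
        super().__init__()
        p_dim = int(out_dim * p_ratio)    # width of the periodic (cos/sin) branch
        g_dim = out_dim - 2 * p_dim       # remaining width for the standard branch
        self.periodic = nn.Linear(in_dim, p_dim, bias=False)  # learned frequencies
        self.general = nn.Linear(in_dim, g_dim)               # ordinary MLP-style branch
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        p = self.periodic(x)
        # Fourier-series-style features (cos, sin of a learned projection) are
        # concatenated with a conventional nonlinear projection of the input.
        return torch.cat([torch.cos(p), torch.sin(p), self.act(self.general(x))], dim=-1)


if __name__ == "__main__":
    # Used like an MLP block of matching width, e.g. inside a Transformer feed-forward slot.
    layer = FANLayer(in_dim=64, out_dim=128)
    y = layer(torch.randn(8, 64))
    print(y.shape)  # torch.Size([8, 128])
```

Because the cos/sin branch shares one learned projection for both components, such a layer uses fewer parameters than an MLP of the same output width, which is consistent with the parameter and FLOPs savings claimed in the abstract.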