
Functional Continuous Decomposition

February 24, 2026
Authors: Teymur Aghayev
cs.AI

Abstract

The analysis of non-stationary time-series data requires insight into both local and global patterns together with physical interpretability. However, traditional smoothing algorithms, such as B-splines, Savitzky-Golay filtering, and Empirical Mode Decomposition (EMD), cannot perform parametric optimization with guaranteed continuity. In this paper, we propose Functional Continuous Decomposition (FCD), a JAX-accelerated framework that performs parametric, continuous optimization over a wide range of mathematical functions. Using Levenberg-Marquardt optimization to achieve up to C^1-continuous fitting, FCD transforms raw time-series data into M modes that capture temporal patterns ranging from short-term fluctuations to long-term trends. FCD has applications in physics, medicine, financial analysis, and machine learning, where it supports the analysis of temporal signal patterns as well as the optimized parameters, derivatives, and integrals of the decomposition. In physical analysis and feature extraction, FCD achieves an average standardized RMSE (SRMSE) of 0.735 per segment and completes a full decomposition of 1,000 points in 0.47 s. Finally, we demonstrate that a Convolutional Neural Network (CNN) augmented with FCD features, such as optimized function values, parameters, and derivatives, converges 16.8% faster and achieves 2.5% higher accuracy than a standard CNN.
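The core mechanism the abstract describes, Levenberg-Marquardt fitting of a parametric function to a data segment under JAX, can be sketched as follows. This is a minimal illustration under assumed choices, not the FCD implementation: the damped-sinusoid model, the fixed damping factor, and all names are hypothetical, and the cross-segment C^1 continuity constraints that FCD enforces are omitted here.

```python
# Minimal sketch of Levenberg-Marquardt segment fitting in JAX.
# Model choice (damped sinusoid) and all names are illustrative assumptions,
# not the actual FCD API; FCD's C^1 continuity coupling between segments
# is not shown.
import jax
import jax.numpy as jnp

def model(params, t):
    # amplitude, decay rate, angular frequency, phase
    a, b, w, phi = params
    return a * jnp.exp(-b * t) * jnp.sin(w * t + phi)

def residuals(params, t, y):
    return model(params, t) - y

@jax.jit
def lm_step(params, t, y, lam):
    r = residuals(params, t, y)
    J = jax.jacobian(residuals)(params, t, y)      # (N, 4) Jacobian via autodiff
    A = J.T @ J + lam * jnp.eye(params.shape[0])   # damped normal equations
    return params - jnp.linalg.solve(A, J.T @ r)

# Synthetic segment and a few damped Gauss-Newton iterations.
t = jnp.linspace(0.0, 4.0, 200)
true = jnp.array([1.5, 0.4, 3.0, 0.3])
y = model(true, t)

params = jnp.array([1.4, 0.35, 2.95, 0.25])  # initial guess near the optimum
for _ in range(20):
    params = lm_step(params, t, y, 1e-3)

rmse = jnp.sqrt(jnp.mean(residuals(params, t, y) ** 2))
srmse = rmse / jnp.std(y)  # standardized RMSE, the per-segment metric reported
```

Because the model is differentiable JAX code, the Jacobian comes from `jax.jacobian` rather than hand-derived formulas, and the fitted parameters remain differentiable quantities, which is what allows the derivatives and integrals of each mode to be extracted as downstream features.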