Chronos: Learning the Language of Time Series

March 12, 2024
Authors: Abdul Fatir Ansari, Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur, Syama Sundar Rangapuram, Sebastian Pineda Arango, Shubham Kapoor, Jasper Zschiegner, Danielle C. Maddix, Michael W. Mahoney, Kari Torkkola, Andrew Gordon Wilson, Michael Bohlke-Schneider, Yuyang Wang
cs.AI

Abstract

We introduce Chronos, a simple yet effective framework for pretrained probabilistic time series models. Chronos tokenizes time series values using scaling and quantization into a fixed vocabulary and trains existing transformer-based language model architectures on these tokenized time series via the cross-entropy loss. We pretrained Chronos models based on the T5 family (ranging from 20M to 710M parameters) on a large collection of publicly available datasets, complemented by a synthetic dataset that we generated via Gaussian processes to improve generalization. In a comprehensive benchmark consisting of 42 datasets, and comprising both classical local models and deep learning methods, we show that Chronos models: (a) significantly outperform other methods on datasets that were part of the training corpus; and (b) have comparable and occasionally superior zero-shot performance on new datasets, relative to methods that were trained specifically on them. Our results demonstrate that Chronos models can leverage time series data from diverse domains to improve zero-shot accuracy on unseen forecasting tasks, positioning pretrained models as a viable tool to greatly simplify forecasting pipelines.
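To make the tokenization idea concrete, below is a minimal sketch of how time series values could be mapped to a fixed vocabulary via scaling and uniform quantization, in the spirit of the abstract. The function names, bin count, and quantization range are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def tokenize_series(values, num_bins=4094, low=-15.0, high=15.0):
    """Sketch of scaling + quantization into a fixed vocabulary.
    Parameters (bin count, range) are assumptions for illustration."""
    # Scale by the mean absolute value of the context window
    scale = np.mean(np.abs(values))
    if scale == 0:
        scale = 1.0
    scaled = values / scale
    # Uniform bin edges over a fixed range; out-of-range values are clipped
    edges = np.linspace(low, high, num_bins + 1)
    tokens = np.clip(np.digitize(scaled, edges) - 1, 0, num_bins - 1)
    return tokens, scale

def detokenize(tokens, scale, num_bins=4094, low=-15.0, high=15.0):
    """Map token ids back to real values via bin centers, then unscale."""
    edges = np.linspace(low, high, num_bins + 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[tokens] * scale

# Example: tokenize a short series, then recover approximate values
series = np.array([10.0, 12.5, 9.8, 11.2])
tokens, scale = tokenize_series(series)
reconstructed = detokenize(tokens, scale)
```

With the series represented as discrete tokens, a standard language-model architecture can be trained on them with the usual cross-entropy objective, and probabilistic forecasts can be obtained by sampling future tokens and detokenizing them.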
