

Toto: Time Series Optimized Transformer for Observability

July 10, 2024
Authors: Ben Cohen, Emaad Khwaja, Kan Wang, Charles Masson, Elise Ramé, Youssef Doubli, Othmane Abou-Amal
cs.AI

Abstract

This technical report describes the Time Series Optimized Transformer for Observability (Toto), a new state-of-the-art foundation model for time series forecasting developed by Datadog. In addition to advancing the state of the art on generalized time series benchmarks in domains such as electricity and weather, this model is the first general-purpose time series forecasting foundation model to be specifically tuned for observability metrics. Toto was trained on a dataset of one trillion time series data points, the largest among all currently published time series foundation models. Alongside publicly available time series datasets, 75% of the data used to train Toto consists of fully anonymous numerical metric data points from the Datadog platform. In our experiments, Toto outperforms existing time series foundation models on observability data. It does this while also excelling at general-purpose forecasting tasks, achieving state-of-the-art zero-shot performance on multiple open benchmark datasets.
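To make the zero-shot setting concrete, the sketch below illustrates the evaluation protocol implied by the abstract: a model pretrained elsewhere forecasts a held-out horizon of a target series with no fitting on that dataset. This is not the paper's code or Toto's API; the predictor is a seasonal-naive stand-in, and the context length, horizon, seasonality, and metric choices are illustrative assumptions.

```python
# Minimal sketch of zero-shot forecast evaluation, assuming a pretrained model
# can be called as a black-box predictor. `pretrained_forecast` is a
# hypothetical stand-in (seasonal-naive), not Toto.
import numpy as np


def pretrained_forecast(context: np.ndarray, horizon: int, season: int = 24) -> np.ndarray:
    """Stand-in predictor: repeat the last observed seasonal cycle.
    A foundation model would replace this call, still without fitting on the
    target dataset -- that absence of target-domain training is what makes
    the evaluation zero-shot."""
    cycle = context[-season:]
    reps = int(np.ceil(horizon / season))
    return np.tile(cycle, reps)[:horizon]


def zero_shot_eval(series: np.ndarray, context_len: int = 512, horizon: int = 96) -> dict:
    """Split a series into context and a held-out horizon, forecast the
    horizon without any training on the target data, and score the result."""
    context = series[:context_len]
    target = series[context_len:context_len + horizon]
    forecast = pretrained_forecast(context, horizon)
    mae = float(np.mean(np.abs(forecast - target)))
    smape = float(np.mean(2.0 * np.abs(forecast - target)
                          / (np.abs(forecast) + np.abs(target) + 1e-8)))
    return {"MAE": mae, "sMAPE": smape}


if __name__ == "__main__":
    # Synthetic hourly series with daily seasonality, for illustration only.
    t = np.arange(512 + 96)
    rng = np.random.default_rng(0)
    series = 10 + 2 * np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
    print(zero_shot_eval(series))
```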
