Toto: Time Series Optimized Transformer for Observability
July 10, 2024
Authors: Ben Cohen, Emaad Khwaja, Kan Wang, Charles Masson, Elise Ramé, Youssef Doubli, Othmane Abou-Amal
cs.AI
Abstract
This technical report describes the Time Series Optimized Transformer for Observability (Toto), a new state-of-the-art foundation model for time series forecasting developed by Datadog. In addition to advancing the state of the art on generalized time series benchmarks in domains such as electricity and weather, this model is the first general-purpose time series forecasting foundation model to be specifically tuned for observability metrics.

Toto was trained on a dataset of one trillion time series data points, the largest among all currently published time series foundation models. Alongside publicly available time series datasets, 75% of the data used to train Toto consists of fully anonymous numerical metric data points from the Datadog platform.

In our experiments, Toto outperforms existing time series foundation models on observability data. It does this while also excelling at general-purpose forecasting tasks, achieving state-of-the-art zero-shot performance on multiple open benchmark datasets.
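For readers unfamiliar with the term, "zero-shot" here means the pretrained model forecasts a dataset it never saw during training, with no fine-tuning. The sketch below is purely illustrative and does not use Toto's actual API or any code from the report; naive_seasonal_forecast is a hypothetical stand-in where the pretrained foundation model would go. It only shows the evaluation protocol: the model receives a context window and is scored on the unseen forecast horizon.

# Illustrative zero-shot forecasting evaluation loop (not Toto's actual API).
# A pretrained foundation model would replace naive_seasonal_forecast; the
# point is the protocol: forecast from context only, no fine-tuning on the target data.
import numpy as np

def naive_seasonal_forecast(context: np.ndarray, horizon: int, season: int = 24) -> np.ndarray:
    # Stand-in "model": repeat the last observed seasonal cycle.
    last_cycle = context[-season:]
    reps = int(np.ceil(horizon / season))
    return np.tile(last_cycle, reps)[:horizon]

def zero_shot_mae(series: np.ndarray, context_len: int, horizon: int) -> float:
    # Split a held-out series into context and horizon, forecast, and score.
    context = series[:context_len]
    target = series[context_len:context_len + horizon]
    forecast = naive_seasonal_forecast(context, horizon)
    return float(np.mean(np.abs(forecast - target)))

# Toy hourly series with a daily cycle, standing in for an observability metric.
t = np.arange(24 * 14, dtype=float)
series = 10 + 3 * np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(f"zero-shot MAE: {zero_shot_mae(series, context_len=24 * 7, horizon=24):.3f}")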