arxiv:2505.14766

This Time is Different: An Observability Perspective on Time Series Foundation Models

Published on May 20
· Submitted by Emaad on May 22
Abstract

Toto is a time series forecasting foundation model using a decoder-only architecture, demonstrating state-of-the-art performance on large-scale benchmarks with multivariate observability data.

AI-generated summary

We introduce Toto, a time series forecasting foundation model with 151 million parameters. Toto uses a modern decoder-only architecture coupled with architectural innovations designed to account for specific challenges found in multivariate observability time series data. Toto's pre-training corpus is a mixture of observability data, open datasets, and synthetic data, and is 4-10 times larger than those of leading time series foundation models. Additionally, we introduce BOOM, a large-scale benchmark consisting of 350 million observations across 2,807 real-world time series. For both Toto and BOOM, we source observability data exclusively from Datadog's own telemetry and internal observability metrics. Extensive evaluations demonstrate that Toto achieves state-of-the-art performance both on BOOM and on established general-purpose time series forecasting benchmarks. Toto's model weights, inference code, and evaluation scripts, as well as BOOM's data and evaluation code, are all available as open source under the Apache 2.0 License at https://huggingface.co/Datadog/Toto-Open-Base-1.0 and https://github.com/DataDog/toto.
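To make the decoder-only framing concrete, here is a minimal, hypothetical sketch of the autoregressive loop such a forecaster performs on multivariate data: each predicted step is appended to the context and fed back in. The `stub_decoder_step` function is a placeholder, not Toto's actual model or API; the real model would run the context through transformer layers.

```python
import numpy as np

def stub_decoder_step(context: np.ndarray) -> np.ndarray:
    """Placeholder for a decoder-only model's next-step prediction.

    A real foundation model (e.g. Toto) would pass the context through
    transformer layers; here we simply average the last four steps so
    the example is self-contained and runnable.
    """
    return context[-4:].mean(axis=0)

def autoregressive_forecast(series: np.ndarray, horizon: int) -> np.ndarray:
    """Decoder-only style generation: predict one step, append it to the
    context, and repeat until the forecast horizon is filled."""
    context = series.copy()
    predictions = []
    for _ in range(horizon):
        next_step = stub_decoder_step(context)   # shape: (n_variables,)
        predictions.append(next_step)
        context = np.vstack([context, next_step])
    return np.array(predictions)

# Multivariate observability-style input: 64 time steps of 3 metrics
# (e.g. CPU, memory, request latency).
history = np.random.default_rng(0).normal(size=(64, 3))
forecast = autoregressive_forecast(history, horizon=12)
print(forecast.shape)  # (12, 3): one row per forecast step
```

For real inference, the stub would be replaced by the released model; the weights and inference code linked above show the actual interface.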

Community

Paper submitter

We are excited to announce a new open-weights release of Toto, our SOTA time series foundation model, and BOOM, a new public observability benchmark containing 350 million observations across 2,807 real-world time series.

Both are open source under the Apache 2.0 license and available to use immediately!

