---
license: mit
---

# Large Time-Series Model (Timer)

A large time-series model introduced in this paper and enhanced in our follow-up work.

The base version is pre-trained on 260B time points and supports zero-shot forecasting (benchmark) as well as downstream adaptation.

See this GitHub page for examples of using this model.
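As a minimal sketch of zero-shot forecasting (assuming this checkpoint is published on the Hub as `thuml/timer-base-84m` and loads its custom model code via `trust_remote_code`; the linked GitHub page has the authoritative examples):

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed repo id for this checkpoint; the Timer model class is defined in
# remote code on the Hub, hence trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(
    "thuml/timer-base-84m", trust_remote_code=True
)

# One univariate context series of shape (batch, lookback); 2880 points is
# an illustrative lookback length, not a model requirement.
context = torch.randn(1, 2880)

# Zero-shot forecast: autoregressively generate the next 96 time points.
prediction_length = 96
forecast = model.generate(context, max_new_tokens=prediction_length)
```

Longer horizons are produced the same way by raising `max_new_tokens`; no fine-tuning is needed for the zero-shot setting.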

## Acknowledgments

Timer is built mostly from publicly available Internet time-series datasets contributed by different research teams and providers. We sincerely thank all the individuals and organizations who shared their data; without their generosity, this model would not exist.

## Citation

```bibtex
@inproceedings{liutimer,
  title={Timer: Generative Pre-trained Transformers Are Large Time Series Models},
  author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024}
}

@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}
```