---
license: mit
---

# Large Time-Series Model (Timer)

The [large time-series model](https://cloud.tsinghua.edu.cn/f/b766629dbc584a4e8563/) introduced in this [paper](https://arxiv.org/abs/2402.02368) and enhanced with our [further work](https://arxiv.org/abs/2410.04803).

The base version is pre-trained on **260B** time points and supports zero-shot forecasting ([benchmark](https://cdn-uploads.huggingface.co/production/uploads/64fbe24a2d20ced4e91de38a/n2IW7fTRpuZFMYoPr1h4O.png)) as well as downstream adaptation. See the [GitHub page](https://github.com/thuml/Large-Time-Series-Model) for examples of using this model; a minimal usage sketch is also included at the end of this card.

## Acknowledgments

Timer is built largely from publicly available time-series datasets contributed by different research teams and providers. We sincerely thank all individuals and organizations who shared their data; without their generous contributions, this model would not exist.

* Time-Series-Library (https://github.com/thuml/Time-Series-Library)
* UTSD (https://huggingface.co/datasets/thuml/UTSD)
* LOTSA (https://huggingface.co/datasets/Salesforce/lotsa_data)

## Citation

```
@inproceedings{liutimer,
  title={Timer: Generative Pre-trained Transformers Are Large Time Series Models},
  author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024}
}

@article{liu2024timer,
  title={Timer-XL: Long-Context Transformers for Unified Time Series Forecasting},
  author={Liu, Yong and Qin, Guo and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2410.04803},
  year={2024}
}
```
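
## Usage sketch

The snippet below is a minimal sketch of zero-shot forecasting, not the official usage. It assumes the checkpoint is published on the Hugging Face Hub with custom modeling code loadable via `trust_remote_code`; the repository id `thuml/timer-base-84m`, the context length, and the `generate`-based interface are all assumptions here. The [GitHub page](https://github.com/thuml/Large-Time-Series-Model) above is the authoritative reference.

```python
# Minimal zero-shot forecasting sketch.
# Assumptions: the checkpoint is available on the Hugging Face Hub (repo id below is
# hypothetical) and ships Timer-specific modeling code usable via trust_remote_code.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "thuml/timer-base-84m",   # hypothetical repo id; replace with the actual checkpoint
    trust_remote_code=True,   # load the custom Timer modeling code from the repo
)

# A batch of univariate series: float tensor of shape (batch_size, context_length).
# Random data is used here only to illustrate the expected shapes.
context = torch.randn(2, 672)

# Forecast the next `prediction_length` time points autoregressively.
prediction_length = 96
forecast = model.generate(context, max_new_tokens=prediction_length)
print(forecast.shape)  # expected: (2, 96), depending on the repo's remote code
```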