Update README.md
README.md CHANGED
@@ -12,6 +12,10 @@ tags:
 
 # Chronos-T5 (Small)
 
+🚀 **Update Feb 14, 2025**: Chronos-Bolt & original Chronos models are now available on Amazon SageMaker JumpStart! Check out the [tutorial notebook](https://github.com/amazon-science/chronos-forecasting/blob/main/notebooks/deploy-chronos-bolt-to-amazon-sagemaker.ipynb) to learn how to deploy Chronos endpoints for production use in a few lines of code.
+
+🚀 **Update Nov 27, 2024**: We have released Chronos-Bolt⚡️ models that are more accurate (5% lower error), up to 250 times faster and 20 times more memory-efficient than the original Chronos models of the same size. Check out the new models [here](https://huggingface.co/autogluon/chronos-bolt-base).
+
 Chronos is a family of **pretrained time series forecasting models** based on language model architectures. A time series is transformed into a sequence of tokens via scaling and quantization, and a language model is trained on these tokens using the cross-entropy loss. Once trained, probabilistic forecasts are obtained by sampling multiple future trajectories given the historical context. Chronos models have been trained on a large corpus of publicly available time series data, as well as synthetic data generated using Gaussian processes.
 
 For details on Chronos models, training data and procedures, and experimental results, please refer to the paper [Chronos: Learning the Language of Time Series](https://arxiv.org/abs/2403.07815).
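
To make the scaling-and-quantization step described above concrete, here is a minimal sketch of mean scaling followed by uniform binning. The function name, bin count, and bin range here are illustrative assumptions; the actual Chronos tokenizer (vocabulary size, bin placement, special tokens) differs.

```python
import numpy as np

def tokenize_series(values: np.ndarray, num_bins: int = 4094, limit: float = 15.0) -> np.ndarray:
    """Map a real-valued series to integer token ids via mean scaling + uniform quantization.

    A simplified illustration of the idea, not the exact Chronos implementation.
    """
    # Mean scaling: normalize by the mean absolute value of the context
    # (fall back to 1.0 for an all-zero series).
    scale = np.mean(np.abs(values)) or 1.0
    scaled = values / scale

    # Uniform bins over [-limit, limit]; each bin index becomes a token id.
    bin_edges = np.linspace(-limit, limit, num_bins - 1)
    return np.digitize(scaled, bin_edges)

tokens = tokenize_series(np.array([112.0, 118.0, 132.0, 129.0, 121.0]))
print(tokens)  # one integer token id per time step
```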
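For the forecasting side, a short usage sketch of the released `chronos-forecasting` package, following its documented API: `predict` samples multiple future trajectories given the historical context, which are then summarized into quantile forecasts. The context values are toy data.

```python
import numpy as np
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",          # use "cuda" for GPU inference
    torch_dtype=torch.bfloat16,
)

# Historical context as a 1-D tensor of past observations (toy values).
context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])

# predict() draws sample trajectories: shape [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=12)

# Reduce the sampled trajectories to quantile forecasts.
low, median, high = np.quantile(forecast[0].numpy(), [0.1, 0.5, 0.9], axis=0)
```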

