vasilii-feofanov committed (verified)
Commit 3113da9 · 1 Parent(s): 572b9c9

Update README.md

Files changed (1): README.md (+4 -4)
README.md CHANGED
@@ -33,10 +33,10 @@ Paris Noah's Ark Lab consists of 3 research teams that cover the following topic
 - [A Systematic Study Comparing Hyperparameter Optimization Engines on Tabular Data](https://balazskegl.medium.com/navigating-the-maze-of-hyperparameter-optimization-insights-from-a-systematic-study-6019675ea96c): insights to navigate the maze of hyperopt techniques.
 
 ### 2025
-- *(ICML'25)* - [AdaPTS: Adapting Univariate Foundation Models to Probabilistic Multivariate Time Series Forecasting](https://arxiv.org/abs/2502.10235): simple yet powerful tricks to extend foundation models.
-- *(ICLR'25)* - [Zero-shot Model-based Reinforcement Learning using Large Language Models](https://huggingface.co/papers/2410.11711): disentangled in-context learning for multivariate time series forecasting and model-based RL.
-- *(ICASSP'25)* - [Easing Optimization Paths: A Circuit Perspective](https://arxiv.org/abs/2501.02362): mechanistic study of training dynamics in transformers.
-- *(Neurocomputing)* - [Self-training: A survey](https://www.sciencedirect.com/science/article/pii/S0925231224016758): know more about pseudo-labeling strategies.
+- *(ICML'25)* [AdaPTS: Adapting Univariate Foundation Models to Probabilistic Multivariate Time Series Forecasting](https://arxiv.org/abs/2502.10235): simple yet powerful tricks to extend foundation models.
+- *(ICLR'25)* [Zero-shot Model-based Reinforcement Learning using Large Language Models](https://huggingface.co/papers/2410.11711): disentangled in-context learning for multivariate time series forecasting and model-based RL.
+- *(ICASSP'25)* [Easing Optimization Paths: A Circuit Perspective](https://arxiv.org/abs/2501.02362): mechanistic study of training dynamics in transformers.
+- *(Neurocomputing)* [Self-training: A survey](https://www.sciencedirect.com/science/article/pii/S0925231224016758): know more about pseudo-labeling strategies.
 
 ### 2024