Time-MoE: The Latest Foundation Forecasting Model | by Marco Peixeiro | Oct, 2024


Explore the open-source large time model Time-MoE and apply it in a small experiment using Python

Towards Data Science


Traditionally, the field of time series forecasting relied on data-specific models, where a model was trained on a specific dataset and task. If the data or the forecast horizon changed, the model also had to be changed.

Since October 2023, researchers have been actively developing foundation forecasting models. With these large time models, a single model can now handle different forecasting tasks from different domains, at different frequencies, and with virtually any forecast horizon.

Such large time models include:

  • TimeGPT, which is accessed via an API, making it easy to perform forecasting and fine-tuning without using local resources
  • Lag-Llama, an open-source model for probabilistic forecasting that constructs features from lagged values
  • Chronos, a model based on T5 that translates the unbounded time series domain to the bounded language domain through tokenization and quantization
  • Moirai, a model that supports exogenous features and was the first to publicly share its pretraining dataset, LOTSA, containing more than 27B data points.
