A new age for time series

Created by author using DALL·E 3

Google just entered the race for foundation models in time-series forecasting.

In August 2023, the time-series community was disrupted by the release of TimeGPT, Nixtla’s first foundation model for time series forecasting.

Following TimeGPT, multiple foundation forecasting models were released, but one stood out: Google recently unveiled TimesFM [1], a groundbreaking time-series model with impressive benchmark results.

Time series are ubiquitous, appearing in domains such as retail, energy demand, economics, and healthcare. A foundation time-series model could be applied to any forecasting task out of the box with high accuracy, much as GPT-4 is for text.

In this article, we discuss:

  1. The challenges of foundation models in time series compared to NLP.
  2. How TimesFM overcomes these challenges.
  3. How TimesFM works and why it’s a powerful model.
  4. TimesFM benchmark results.
  5. Prospects for the future of foundation models in time-series forecasting.

Let’s get started.

I’ve launched AI Horizon Forecast, a newsletter focusing on time-series and innovative AI research. Subscribe here to broaden your horizons!

The promise of foundation models in NLP was already evident with the release of GPT-2, introduced in "Language Models are Unsupervised Multitask Learners" [2].

But in time series, building a foundation model is not straightforward. There are several challenges:

  • Dataset Scarcity: NLP models train on abundant public text, but large public time-series datasets are comparatively scarce.
  • Unpredictable Format: Language follows well-defined grammar and vocabulary, whereas time-series data vary widely across domains (e.g., highly sparse retail sales versus volatile financial data).
  • Different Granularities: Traditional time-series models are trained for a single sampling frequency (e.g., hourly, weekly, or monthly).
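To make the granularity point concrete, here is a minimal sketch (using pandas with synthetic data, not anything from the TimesFM paper) showing how the same underlying signal becomes three very different series depending on the sampling frequency a model is trained on:

```python
import numpy as np
import pandas as pd

# Synthetic hourly series over four weeks (illustrative data only)
idx = pd.date_range("2024-01-01", periods=24 * 28, freq="h")
hourly = pd.Series(np.random.default_rng(0).normal(size=len(idx)), index=idx)

# The same signal viewed at coarser granularities
daily = hourly.resample("D").mean()   # one point per day
weekly = hourly.resample("W").mean()  # one point per week

print(len(hourly), len(daily), len(weekly))  # 672 28 4
```

A model fitted on the hourly view sees 672 observations with strong intraday patterns; the weekly view of the same month has only 4 points. A single foundation model must handle all of these regimes at once.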