Time series forecasting just got its GPT moment. Google Research dropped TimesFM, a decoder-only foundation model that treats forecasting like language modeling, and it actually works. Unlike traditional methods that need domain expertise and feature engineering for each dataset, TimesFM learns universal patterns across time series and generalizes to new data zero-shot. The 2.5 version is already integrated into BigQuery as an official Google product, so you know it's battle-tested at scale.
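The "forecasting as language modeling" idea boils down to treating fixed-length chunks of a series as tokens that a decoder predicts autoregressively. Here's a minimal, purely illustrative sketch (not TimesFM's code) of that patching step; the patch length of 32 follows the TimesFM paper, and `to_patches` is a hypothetical helper name:

```python
import numpy as np

# Illustrative sketch, NOT TimesFM source: a decoder-only forecaster
# "tokenizes" a 1-D series into fixed-length patches, the way a language
# model consumes words. Patch length 32 follows the TimesFM paper.
PATCH_LEN = 32

def to_patches(series: np.ndarray, patch_len: int = PATCH_LEN) -> np.ndarray:
    """Split a 1-D series into consecutive patches, truncating any remainder."""
    n_patches = len(series) // patch_len
    return series[: n_patches * patch_len].reshape(n_patches, patch_len)

series = np.sin(np.linspace(0, 20, 100))
patches = to_patches(series)
print(patches.shape)  # (3, 32): three "tokens", each a 32-step patch
```

Each patch becomes one position in the decoder's context, which is why a longer context window (16k in 2.5) directly translates into longer usable history.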

What sets TimesFM apart is the engineering intelligence: 200M parameters (down from 500M) while supporting a 16k context length (up from 2048), continuous quantile forecasting for uncertainty estimation, and smart preprocessing that handles normalization and positive-value inference automatically. The model eliminates the frequency-indicator requirement and adds covariate support through XReg. Installation takes minutes with clear PyTorch/JAX backend options, and the API is refreshingly straightforward: no PhD in statistics required.
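To make the preprocessing claim concrete, here's a small sketch of what per-series normalization plus positive-value inference typically looks like. This is an assumption-laden illustration of the general technique, not the library's implementation; `preprocess` and `postprocess` are hypothetical names:

```python
import numpy as np

# Illustrative sketch (NOT TimesFM code) of the preprocessing described in
# the text: normalize each series, and infer whether its history is strictly
# positive so forecasts can be clipped at zero on the way out.

def preprocess(series: np.ndarray):
    """Normalize to zero mean / unit variance and flag strict positivity."""
    mean, std = series.mean(), series.std()
    std = std if std > 0 else 1.0           # guard against constant series
    is_positive = bool((series > 0).all())  # e.g. sales counts, prices
    return (series - mean) / std, mean, std, is_positive

def postprocess(forecast: np.ndarray, mean, std, is_positive):
    """Undo normalization; clip at zero when the history was all positive."""
    out = forecast * std + mean
    return np.clip(out, 0.0, None) if is_positive else out

history = np.array([3.0, 5.0, 4.0, 6.0, 5.0])
norm, mu, sigma, pos = preprocess(history)
restored = postprocess(norm, mu, sigma, pos)
print(pos)                             # True: all history values are positive
print(np.allclose(restored, history))  # True: round-trip is lossless
```

Baking steps like these into the model's own pipeline is exactly what lets users skip the per-dataset feature engineering the first paragraph complains about.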

This hits the sweet spot for data scientists tired of hand-tuning ARIMA models and for ML engineers who want production-ready forecasting without building custom transformers. With 9.4k stars and backing from Google Research's ICML 2024 paper, the momentum is real. The codebase is actively maintained with recent updates, comprehensive examples, and that rare combination of academic rigor and practical usability.
Stars: 9443
💻 Language: Python
🔗 Repository: google-research/timesfm