Model Launch
Google's 200M-parameter time-series foundation model with 16k context

What it is
TimesFM is a foundation model for time-series forecasting. Picture a language model, but instead of predicting the next word, it predicts the next number in a sequence. Feed it sales figures, CPU usage, or sensor data; it forecasts future values without dataset-specific training. The 16k context window means it can look back at up to 16,384 previous data points to make predictions.
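To make the "history in, future values out" framing concrete, here is a toy sketch of that interface. The `naive_forecast` function is a hypothetical stand-in (last-value carry-forward), not the real model; TimesFM replaces it with a learned transformer, but the call shape is the same:

```python
CONTEXT_LEN = 16_384  # maximum number of past points the model can attend to

def naive_forecast(history, horizon):
    """Stand-in forecaster: repeat the last observed value.
    Illustrates the interface only; TimesFM learns the next value instead."""
    context = history[-CONTEXT_LEN:]  # only the trailing context window is used
    last = context[-1]
    return [last] * horizon

cpu_usage = [0.41, 0.39, 0.44, 0.47, 0.52]  # any numeric sequence works
print(naive_forecast(cpu_usage, horizon=3))  # → [0.52, 0.52, 0.52]
```

Whatever the predictor, the contract stays fixed: a numeric history (truncated to the context window) goes in, a list of `horizon` future values comes out.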
Why it matters
Most time-series models require painful per-dataset training. TimesFM works zero-shot: load your CSV, get forecasts. If you're building anomaly detection, demand planning, or infrastructure monitoring, this cuts weeks of ML work down to a few inference calls. It's Apache 2.0 licensed and runs locally, so there's no vendor lock-in.
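The "load your CSV, get forecasts" workflow can be sketched with the standard library alone. The column name and the commented-out `model.forecast` call below are hypothetical placeholders, not the actual TimesFM API:

```python
import csv
import os
import tempfile

def load_series(path, column):
    """Read one numeric column from a CSV into a plain list of floats."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

# Demo with a throwaway CSV standing in for real sales data.
with tempfile.NamedTemporaryFile(
    "w", suffix=".csv", delete=False, newline=""
) as f:
    f.write("date,units_sold\n2024-01-01,120\n2024-01-02,135\n")
    demo_path = f.name

sales = load_series(demo_path, column="units_sold")
print(sales)  # → [120.0, 135.0]
# forecast = model.forecast(sales, horizon=30)  # hypothetical model call
os.remove(demo_path)
```

The point of zero-shot here is that nothing between loading the column and calling the model is dataset-specific: no feature engineering, no training loop.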
Key details
- 200 million parameters — small enough to run on single GPUs, large enough for complex patterns
- 16,384-point context window — roughly 45 years of daily data or almost two years of hourly readings
- Trained on 100+ billion time points from retail, web traffic, manufacturing, and finance datasets
- Zero-shot forecasting — no fine-tuning required for new datasets
- Apache 2.0 license on GitHub — free for commercial use, inference code included
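How much history the context window covers is just division by the sampling rate; a quick check:

```python
CONTEXT = 16_384  # context window size in data points

def coverage_days(points, samples_per_day):
    """Days of history that fit in the context at a given sampling rate."""
    return points / samples_per_day

print(coverage_days(CONTEXT, 1) / 365.25)    # daily data: ~44.9 years
print(coverage_days(CONTEXT, 24) / 365.25)   # hourly data: ~1.87 years
print(coverage_days(CONTEXT, 24 * 60))       # minute data: ~11.4 days
```

In practice this means even minute-resolution metrics get over a week and a half of lookback before the window truncates.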