

Google TimesFM Turns BigQuery into a Forecasting Engine — 200M Parameters, Zero Training, 16K Context

Time series forecasting is one of those problems that sounds simple and is absolutely not. You want to predict next week’s sales, next month’s server load, next quarter’s energy demand. Traditional approach: hire a data scientist, collect historical data, pick a model (ARIMA, Prophet, maybe a custom LSTM), train it, tune it, deploy it, and pray it doesn’t break when the data distribution shifts. Repeat for every new dataset.

Google Research looked at this and asked the obvious question: what if one model could forecast anything, on any dataset, without any training at all?

That model is TimesFM. It landed on Hacker News on March 31, 2026, pulling 170 points and 77 comments with the headline “Google’s 200M-parameter time-series foundation model with 16k context.” The repo has accumulated over 9,000 GitHub stars. And the reason it’s generating this much attention right now isn’t just the model itself — it’s that Google quietly integrated it into BigQuery and AlloyDB, meaning any team with a Google Cloud account can start forecasting without writing a single line of ML code.

Zero-Shot Forecasting and Why It Changes the Economics

The core idea behind TimesFM is zero-shot forecasting. You give it a time series it has never seen before — could be retail sales in Tokyo, hospital admissions in São Paulo, or electricity prices in Berlin — and it produces a forecast without any fine-tuning or dataset-specific training.

This is the same conceptual leap that GPT brought to text. Before foundation models, every NLP task needed its own trained model. After, you could throw arbitrary text at one model and get useful results. TimesFM does this for time series.

The practical impact is enormous. Most companies that do time series forecasting spend 70-80% of their time on data preparation, model selection, and training. TimesFM collapses that entire workflow. You feed in historical data points, you get a forecast out. No hyperparameter tuning, no cross-validation loops, no retraining when seasonality patterns change.

Google pre-trained TimesFM on over 100 billion real-world time points — a mix of synthetic data and actual Google-scale datasets covering retail, finance, weather, traffic, and more. The scale of the training data is what makes zero-shot work. The model has seen so many different temporal patterns that when it encounters your specific dataset, it already has the statistical intuition to handle it.

On the GIFT-Eval benchmark — the most comprehensive zero-shot forecasting evaluation currently available — TimesFM 2.5 ranked number one among zero-shot foundation models on both MASE (point accuracy) and CRPS (probabilistic accuracy). That’s not a narrow win on one metric. It topped both the “how close are your point predictions” and “how well-calibrated are your uncertainty estimates” leaderboards simultaneously.
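To make the two leaderboard metrics concrete, here is a minimal sketch of what each one measures. The MASE implementation below is the standard definition (forecast MAE scaled by the MAE of a naive one-step-ahead forecast on the history); the pinball loss is the per-quantile building block of probabilistic scores like CRPS, which can be approximated by averaging pinball loss over many quantile levels. The toy numbers are made up for illustration.

```python
# MASE: forecast MAE divided by the MAE of a naive one-step-ahead
# forecast computed on the historical series. Below 1.0 = better
# than naive.
def mase(actual, predicted, history):
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
    naive_mae = sum(abs(history[i] - history[i - 1])
                    for i in range(1, len(history))) / (len(history) - 1)
    return mae / naive_mae

# Pinball (quantile) loss at level q: penalizes under-prediction by q
# and over-prediction by (1 - q). Averaged over many quantile levels,
# it approximates CRPS, the probabilistic metric named above.
def pinball(actual, predicted_q, q):
    return sum(q * (a - p) if a >= p else (1 - q) * (p - a)
               for a, p in zip(actual, predicted_q)) / len(actual)

history = [10, 12, 11, 13, 12, 14]
print(mase([15, 16], [14, 16], history))  # → 0.3125
print(pinball([15], [14], 0.9))           # → 0.9
```

A point forecast can score well on MASE while its uncertainty bands are badly calibrated, which is why GIFT-Eval reports both metrics separately.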

200 Million Parameters: Deliberately Small, Surprisingly Powerful

Here’s the counterintuitive part. TimesFM 2.5 runs on just 200 million parameters. For context, GPT-4 is estimated at over a trillion parameters. Even small language models like Llama-3 8B are 40 times larger. TimesFM achieves state-of-the-art forecasting with a model you can run on modest hardware.

Version 2.0 of TimesFM actually had 500 million parameters. Google cut that in half for version 2.5 while simultaneously improving accuracy. That’s the kind of engineering move that signals real architectural maturity — not just throwing more compute at the problem, but finding more efficient ways to represent temporal knowledge.

The architecture is a decoder-only transformer, similar in spirit to GPT but purpose-built for numerical sequences instead of text tokens. The key innovation is the patch-based input processing. Instead of treating each individual time point as a token (which would be wildly inefficient for sequences of thousands of data points), TimesFM groups every 32 consecutive time points into a single “patch.” Each patch gets transformed into a vector through a residual block, then processed through stacked transformer layers with causal multi-head attention.

The output side mirrors this approach. Each output token maps to 128 time points of forecast, meaning the model can produce long horizons efficiently. Combined with the 16K context window — up from just 2,048 in earlier versions — TimesFM can now ingest over 16,000 historical data points in a single forward pass. That’s enough to capture multi-seasonal structures, regime changes, and low-frequency cycles that shorter-context models would completely miss.
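The token bookkeeping implied by those numbers is worth spelling out. The sketch below uses only the figures stated above (32-point input patches, 128-point output tokens, a 16K context window); it illustrates the arithmetic, not the actual model code, and the padding behavior for partial patches is an assumption.

```python
# Patch/token arithmetic for TimesFM as described in the article.
# Illustration only -- not the model's real implementation.
INPUT_PATCH = 32     # consecutive time points per input token
OUTPUT_PATCH = 128   # forecast points decoded per output token
CONTEXT = 16_384     # 16K context window

def input_tokens(n_points):
    # History is grouped into 32-point patches; assume a final
    # partial patch is padded to full size.
    return -(-min(n_points, CONTEXT) // INPUT_PATCH)  # ceil division

def output_tokens(horizon):
    # Each output token expands into 128 forecast points.
    return -(-horizon // OUTPUT_PATCH)

def patchify(series, size=INPUT_PATCH):
    # Split a raw series into fixed-size patches.
    return [series[i:i + size] for i in range(0, len(series), size)]

print(input_tokens(16_384))  # → 512 tokens for a full 16K history
print(output_tokens(1_000))  # → 8 tokens cover a 1,000-step horizon
```

This is why patching matters: a 16K-point history becomes just 512 transformer tokens instead of 16,384, and a 1,000-step forecast takes 8 decoding steps instead of 1,000.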

For anyone who’s worked with time series in practice, that 16K context window is a big deal. Retail demand often has weekly, monthly, quarterly, and yearly seasonality stacked on top of each other. Energy load data has daily and seasonal patterns with multi-year trends. You need long context to capture all of these without manual feature engineering.

TimesFM 2.5 also added native probabilistic forecasting through an optional 30M-parameter quantile head, producing continuous quantile forecasts up to a 1,000-step horizon. So you don’t just get a point prediction — you get calibrated uncertainty bands. For risk-sensitive domains like finance and supply chain, that’s the difference between a useful forecast and a decoration.

BigQuery and AlloyDB: From Research Paper to Production SQL

This is where TimesFM goes from “interesting research” to “I can use this Monday morning.”

Google integrated TimesFM 2.5 directly into BigQuery and AlloyDB. The AI.FORECAST function is now generally available — GA, not preview, not beta. You write a SQL query, point it at your time series table, and get forecasts back. No Python, no Jupyter notebooks, no model deployment pipeline.

The BigQuery integration includes AI.FORECAST for generating predictions, AI.EVALUATE for checking forecast quality, and a new AI.DETECT_ANOMALIES function (currently in preview) for identifying unusual patterns. The context window dynamically adjusts up to 15K data points based on your input data.
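To give a feel for the shape of an AI.FORECAST call, here is a small query-builder sketch. The named arguments (timestamp_col, data_col, id_cols, horizon) follow Google's documented signature for the function, but verify against the current BigQuery docs before relying on them; the project, table, and column names are made up for illustration.

```python
# Sketch of a BigQuery AI.FORECAST query. Argument names follow
# Google's documentation for the function; the table and column
# names are hypothetical.
def build_forecast_query(table, timestamp_col, data_col,
                         horizon=30, id_cols=None):
    args = [
        f"TABLE `{table}`",
        f"timestamp_col => '{timestamp_col}'",
        f"data_col => '{data_col}'",
        f"horizon => {horizon}",
    ]
    if id_cols:
        cols = ", ".join(f"'{c}'" for c in id_cols)
        args.append(f"id_cols => [{cols}]")
    return "SELECT * FROM AI.FORECAST(" + ", ".join(args) + ")"

sql = build_forecast_query(
    "my_project.sales.daily", "order_date", "revenue",
    horizon=14, id_cols=["store_id"],
)
print(sql)
```

The id_cols argument is what makes this scale: one query can forecast every store, SKU, or sensor in the table as a separate series, which is exactly the "repeat for every new dataset" loop the traditional workflow forced you through.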

AlloyDB — Google’s PostgreSQL-compatible operational database — gets the same treatment. This is particularly interesting because AlloyDB is where your transactional data lives. Real-time sales, sensor readings, operational metrics. Having forecasting built into the operational database means you can run predictions against live data without ETL pipelines or data warehouse roundtrips.

Google also opened up TimesFM’s forecasting through agentic interfaces, including the Agent Development Kit (ADK), MCP Toolbox for Databases, and the Gemini CLI extension. So if you’re building AI agents that need to make decisions based on future projections — think inventory management bots, automated pricing systems, capacity planning agents — TimesFM is accessible as a tool call.

For teams already on Google Cloud, this removes the last barrier. You don’t need ML expertise. You don’t need a forecasting team. You need a SQL query. This is the kind of infrastructure move — similar to what Google did with TurboQuant for LLM inference optimization — that takes a research breakthrough and turns it into a commodity available to every developer on the platform.

The Foundation Model Forecasting Race: TimesFM vs. Chronos vs. Moirai

TimesFM doesn’t exist in a vacuum. Amazon’s Chronos and Salesforce’s Moirai are the two main competitors, and each takes a meaningfully different architectural approach.

Chronos, developed by Amazon, repurposes the T5 language model architecture for time series. It literally tokenizes numerical values into bins and treats forecasting as a sequence-to-sequence translation problem. Clever hack, and it works surprisingly well. Chronos-2, released in late 2025, actually overtook TimesFM 2.5 on GIFT-Eval, achieving higher win rates on both WQL and MASE metrics. Amazon’s approach of leveraging existing LLM infrastructure gives them rapid iteration speed.

Moirai, from Salesforce, takes a third path. It’s an encoder-only transformer that supports multivariate forecasting — meaning it can handle multiple correlated time series simultaneously and incorporate external variables (like weather data or economic indicators) as inputs. Neither TimesFM nor Chronos natively supports this. Moirai-MoE, the mixture-of-experts variant, achieves competitive accuracy with up to 65 times fewer activated parameters than its rivals. For problems where you need to model relationships between multiple time series, Moirai has a structural advantage.

The trade-offs break down like this. TimesFM’s strength is the combination of accuracy, speed, and enterprise integration — it’s the only one baked directly into a major cloud data warehouse. Chronos has the edge on raw benchmark performance after the v2 update, plus Amazon’s ecosystem gives it a natural home in SageMaker and AWS analytics. Moirai wins on multivariate capability and parameter efficiency, but lacks the seamless cloud integration that makes TimesFM and Chronos easy to deploy.

For most enterprise teams, though, the integration story matters more than benchmark deltas. A model that’s 2% better on MASE but requires you to set up a custom inference pipeline is less useful than one that’s a SQL function call away. This is Google’s bet with TimesFM, and based on the Hacker News reaction and developer discussion, it’s resonating.

The time series foundation model space is still early. A year ago, most practitioners hadn’t heard of any of these models. Now there are at least five serious contenders, benchmarks are maturing, and cloud providers are racing to integrate them as first-class services. If you’re still hand-tuning ARIMA models for each new dataset, the window for that approach is closing fast.

