pith. machine review for the scientific record.

Mantis: A Foundation Model for Mechanistic Disease Forecasting

3 Pith papers cite this work. Polarity classification is still indexing.

abstract

Infectious disease forecasting in novel outbreaks or low-resource settings is hampered by the need for large disease and covariate data sets, bespoke training, and expert tuning, all of which can hinder rapid generation of forecasts for new settings. To help address these challenges, we developed Mantis, a foundation model trained entirely on mechanistic simulations, which enables out-of-the-box forecasting across diseases, regions, and outcomes, even in settings with limited historical data. We evaluated Mantis against 78 forecasting models across sixteen diseases with diverse modes of transmission, assessing both point forecast accuracy (mean absolute error) and probabilistic performance (weighted interval score and coverage). Despite using no real-world data during training, Mantis achieved lower mean absolute error than all models in the CDC's COVID-19 Forecast Hub when backtested on early pandemic forecasts which it had not previously seen. Across all other diseases tested, Mantis consistently ranked in the top two models across evaluation metrics. Mantis further generalized to diseases with transmission mechanisms not represented in its training data, demonstrating that it can capture fundamental contagion dynamics rather than memorizing disease-specific patterns. These capabilities illustrate that purely simulation-based foundation models such as Mantis can provide a practical foundation for disease forecasting: general-purpose, accurate, and deployable where traditional models struggle.
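The abstract evaluates probabilistic forecasts with the weighted interval score (WIS). As a point of reference, the standard WIS decomposition (interval width plus penalties for missed coverage, averaged over central prediction intervals and the median) can be sketched as below; the function names and the toy interval levels are illustrative assumptions, not details from the paper.

```python
def interval_score(lower, upper, y, alpha):
    """Interval score IS_alpha for a central (1 - alpha) prediction interval."""
    score = upper - lower                      # sharpness: width of the interval
    if y < lower:                              # penalty when the observation falls below
        score += (2.0 / alpha) * (lower - y)
    elif y > upper:                            # penalty when it falls above
        score += (2.0 / alpha) * (y - upper)
    return score

def weighted_interval_score(median, intervals, y):
    """WIS over K central intervals plus a weighted median absolute error.

    intervals: list of (alpha, lower, upper) tuples, one per central interval.
    """
    k = len(intervals)
    total = 0.5 * abs(y - median)              # w0 = 1/2 on the median term
    for alpha, lower, upper in intervals:
        total += (alpha / 2.0) * interval_score(lower, upper, y, alpha)
    return total / (k + 0.5)

# Toy example: one 50% interval [1, 3] with median 2, observed value y = 4.
wis = weighted_interval_score(2.0, [(0.5, 1.0, 3.0)], 4.0)
```

Lower WIS is better; a forecast is rewarded for narrow intervals and penalized in proportion to how far observations land outside them, which is why the abstract pairs it with coverage as a calibration check.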

citing papers explorer

Showing 3 of 3 citing papers.

  • Latent Chain-of-Thought Improves Structured-Data Transformers cs.LG · 2026-05-11 · unverdicted · ref 19

    Latent chain-of-thought via recurrent feedback tokens improves average performance of structured-data transformers on time-series forecasting and tabular prediction.

  • In-Context Learning Under Regime Change cs.LG · 2026-04-18 · unverdicted · ref 28

    Transformers can solve in-context change-point detection, with performance scaling with model size and with knowledge of the shift timing, matching optimal baselines on synthetic data and improving pretrained models on disease and financial forecasting.

  • Prediction Markets Underperform Simple Baselines For Infectious Disease Forecasting stat.AP · 2026-05-11 · conditional · ref 5

    Prediction markets fail to outperform standard benchmarks for forecasting influenza hospitalizations and measles cases.