Diffusion convolutional recurrent neural network: Data-driven traffic forecasting
11 Pith papers cite this work.
citing papers explorer
- SpatialEpiBench: Benchmarking Spatial Information and Epidemic Priors in Forecasting
SpatialEpiBench shows that adjacency-informed models with epidemic priors underperform a last-value baseline across 11 datasets at horizons from 1 day to 1 month, identifying failures in outbreak anticipation, sparsity handling, and the utility of geographic adjacency.
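The last-value (persistence) baseline the benchmark compares against is simple enough to sketch in a few lines; the array shapes below are illustrative, not the benchmark's actual data layout:

```python
import numpy as np

def persistence_forecast(series: np.ndarray, horizon: int) -> np.ndarray:
    """Last-value baseline: repeat the most recent observation
    for every step of the forecast horizon."""
    return np.repeat(series[..., -1:], horizon, axis=-1)

# Hypothetical weekly case counts: 3 regions (rows) x 5 weeks (cols).
history = np.array([[12, 15, 14, 18, 20],
                    [ 3,  4,  2,  5,  6],
                    [30, 28, 33, 31, 29]])
forecast = persistence_forecast(history, horizon=4)   # shape (3, 4)
```

Because each region's forecast is just its latest value held flat, any learned model that cannot beat this has effectively learned no usable temporal or spatial signal.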
- AirQualityBench: A Realistic Evaluation Benchmark for Global Air Quality Forecasting
AirQualityBench is a realistic global benchmark built from hourly data for six pollutants at 3720 stations spanning 2021-2025, preserving native missingness masks and evaluating on inverse-transformed physical scales.
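Evaluating under a native missingness mask and on physical scales can be sketched as follows (a hedged illustration of the idea, with made-up numbers and an assumed z-score normalization, not the benchmark's code):

```python
import numpy as np

def masked_mae_physical(pred_norm, true_norm, mask, mean, std):
    """MAE computed only where the mask marks a real observation,
    after inverting z-score normalization back to physical units."""
    pred = pred_norm * std + mean
    true = true_norm * std + mean
    return np.abs(pred - true)[mask.astype(bool)].mean()

pred_norm = np.array([[0.5, 0.0], [1.0, -1.0]])
true_norm = np.array([[0.4, 0.2], [1.0, -0.5]])
mask = np.array([[1, 0], [1, 1]])    # 0 marks a native gap, excluded
mae = masked_mae_physical(pred_norm, true_norm, mask, mean=40.0, std=10.0)
```

Scoring on inverse-transformed values matters because errors that look uniform in normalized space can differ greatly in physical units across pollutants.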
- TRIP-Evaluate: An Open Multimodal Benchmark for Evaluating Large Models in Transportation
TRIP-Evaluate is a new open multimodal benchmark with 837 text, image, and point-cloud items organized by a role-task-knowledge taxonomy to evaluate large models on transportation workflows.
- Beyond Static Forecasting: Unleashing the Power of World Models for Mobile Traffic Extrapolation
MobiWM is a multimodal world model for mobile networks that learns state-action dynamics to enable unlimited-horizon counterfactual traffic simulations and optimization.
- SCOT: Multi-Source Cross-City Transfer with Optimal-Transport Soft-Correspondence Objective
SCOT uses Sinkhorn entropic optimal transport to learn explicit soft correspondences between unequal region sets for multi-source cross-city transfer, adding contrastive sharpening and cycle reconstruction for stability and a prototype hub for multi-source alignment.
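The core Sinkhorn iteration behind such entropic soft correspondences can be sketched compactly; the cost matrix and region counts below are toy values, and this is a minimal illustration rather than SCOT's implementation:

```python
import numpy as np

def sinkhorn(cost, a, b, eps=0.5, n_iter=500):
    """Entropic optimal transport via Sinkhorn scaling: returns a
    soft-correspondence (transport) matrix between two region sets
    whose sizes may differ."""
    K = np.exp(-cost / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # satisfy column marginals
        u = a / (K @ v)              # satisfy row marginals
    return u[:, None] * K * v[None, :]

cost = np.array([[0.0, 1.0, 2.0],
                 [1.0, 0.0, 1.0]])   # dissimilarity between regions
a = np.ones(2) / 2                   # source city: 2 regions
b = np.ones(3) / 3                   # target city: 3 regions
P = sinkhorn(cost, a, b)             # 2x3 soft correspondence
```

Each row of `P` distributes a source region's mass softly over target regions, which is what lets unequal region sets be aligned without a hard one-to-one matching.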
- Exploring the Potential of Probabilistic Transformer for Time Series Modeling: A Report on the ST-PT Framework
ST-PT turns transformers into explicit factor graphs for time series, enabling structural injection of symbolic priors, per-sample conditional generation, and principled latent autoregressive forecasting via MFVI iterations.
- CAARL: In-Context Learning for Interpretable Co-Evolving Time Series Forecasting
CAARL decomposes co-evolving time series into autoregressive segments, builds a temporal dependency graph, serializes it into a narrative, and uses LLMs for interpretable forecasting via chain-of-thought reasoning.
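The graph-to-narrative step can be illustrated with a toy dependency test; the lagged-correlation heuristic and sensor names here are assumptions for the sketch, not CAARL's actual method:

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Correlation between x and y shifted forward by `lag` steps."""
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

def serialize_dependencies(series, names, lag=1, thresh=0.8):
    """Turn strong lead-lag relations into a textual narrative an
    LLM could reason over in-context."""
    lines = []
    for i, src in enumerate(names):
        for j, dst in enumerate(names):
            if i != j:
                c = lagged_corr(series[i], series[j], lag)
                if abs(c) >= thresh:
                    lines.append(f"{src} leads {dst} by {lag} step(s) "
                                 f"(corr={c:.2f}).")
    return " ".join(lines) or "No strong dependencies found."

t = np.arange(20, dtype=float)
a = np.sin(t)
b = np.sin(t - 1.0)                  # b trails a by one step
prompt = serialize_dependencies(np.stack([a, b]), ["sensor_A", "sensor_B"])
```

Serializing structure into prose like this is what allows the downstream LLM to produce chain-of-thought explanations tied to explicit dependencies rather than opaque weights.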
- GAMMA-Net: Adaptive Long-Horizon Traffic Spatio-Temporal Forecasting Model based on Interleaved Graph Attention and Multi-Axis Mamba
GAMMA-Net combines Graph Attention Networks and multi-axis Mamba to outperform prior models in long-horizon traffic forecasting, achieving up to 16.25% lower MAE on benchmarks such as METR-LA and the PEMS datasets.
- Relational inductive biases, deep learning, and graph networks
Graph networks unify graph-based neural methods into a general framework with strong relational inductive biases to support combinatorial generalization and structured reasoning in AI.
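The framework's central object is the graph network block: an edge update, a per-node aggregation of edges, a node update, and a global update. A minimal numerical sketch (the concrete update functions here are plain sums chosen for illustration; the framework leaves them as learnable modules):

```python
import numpy as np

def gn_block(V, E, senders, receivers, u):
    """One graph network block with sum-style update functions."""
    # 1. Edge update: combine edge, sender, receiver, and global features.
    E_new = E + V[senders] + V[receivers] + u
    # 2. Aggregate updated edges at their receiver nodes.
    agg = np.zeros_like(V)
    np.add.at(agg, receivers, E_new)
    # 3. Node update from node features, aggregated messages, and global.
    V_new = V + agg + u
    # 4. Global update from pooled edge and node features.
    u_new = u + E_new.mean(axis=0) + V_new.mean(axis=0)
    return V_new, E_new, u_new

V = np.ones((3, 2))                       # 3 nodes, 2 features each
E = np.zeros((2, 2))                      # 2 directed edges
senders, receivers = np.array([0, 1]), np.array([1, 2])
u = np.zeros(2)                           # global features
V2, E2, u2 = gn_block(V, E, senders, receivers, u)
```

Because information only flows along declared edges, the block bakes in the relational inductive bias the paper argues is key to combinatorial generalization.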
- TSNN: A Non-parametric and Interpretable Framework for Traffic Time Series Forecasting
TSNN matches time series entries to a training-derived memory bank to forecast traffic without any trainable parameters and achieves competitive accuracy on four real-world datasets.
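A memory-bank forecaster of this flavor can be sketched as plain nearest-neighbour lookup; the window size, Euclidean distance, and k-NN averaging below are assumed details for illustration, not TSNN's exact design:

```python
import numpy as np

def build_memory(train, window):
    """Store (history window -> next value) pairs from training data."""
    keys = np.stack([train[i:i + window]
                     for i in range(len(train) - window)])
    vals = train[window:]
    return keys, vals

def knn_forecast(query, keys, vals, k=3):
    """Forecast by averaging the continuations of the k closest
    stored windows; no trainable parameters anywhere."""
    d = np.linalg.norm(keys - query, axis=1)
    return vals[np.argsort(d)[:k]].mean()

train = np.array([1., 2., 3., 1., 2., 3., 1., 2., 3., 1.])
keys, vals = build_memory(train, window=2)
pred = knn_forecast(np.array([1., 2.]), keys, vals)
```

Interpretability falls out for free: every prediction can be traced back to the specific historical windows that produced it.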
- Efficient Prompt Learning for Traffic Forecasting
SimpleST is a model-agnostic prompt tuning framework that lets pre-trained spatio-temporal GNNs adapt to distribution shifts in traffic data while keeping all original model weights fixed.
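The frozen-backbone idea is easy to demonstrate on a toy linear model: only a small prompt vector added to the input receives gradient updates, while the pre-trained weights stay untouched. The model, data, and learning rate below are illustrative assumptions, not SimpleST's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
W = np.array([[0.5], [-0.25], [1.0], [0.75]])   # frozen pre-trained weights
prompt = np.zeros(4)                            # the only trainable part

X = rng.normal(size=(32, 4))                    # inputs after a distribution shift
y = (X + 1.0) @ W                               # targets from the shifted regime

lr = 0.1
for _ in range(200):
    residual = (X + prompt) @ W - y             # forward pass; W never updated
    grad = 2 * residual.mean() * W[:, 0]        # gradient w.r.t. the prompt only
    prompt -= lr * grad                         # adapt by moving the prompt
```

Training 4 prompt parameters instead of the backbone recovers the shifted regime here, which is the appeal: adaptation to drift without risking the pre-trained weights.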