pith. machine review for the scientific record. sign in

arxiv: 2605.03386 · v1 · submitted 2026-05-05 · 💻 cs.LG · cs.AI

Recognition: unknown

Local Truncation Error-Guided Neural ODEs for Large Scale Traffic Forecasting

Authors on Pith no claims yet

Pith reviewed 2026-05-07 17:27 UTC · model grok-4.3

classification 💻 cs.LG cs.AI
keywords Neural ODEtraffic forecastinglocal truncation errorspatiotemporal predictionattention maskcontinuous-discrete dynamicsshock handling
0
0 comments X

The pith

Repurposing local truncation errors lets Neural ODEs handle both smooth traffic flows and abrupt shocks.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

Traffic networks combine steady macroscopic patterns with sudden microscopic disruptions, yet standard Neural ODEs enforce Lipschitz continuity that over-smooths shocks. Earlier fixes that penalize integration errors create gradient conflicts and lose sensitivity to anomalies. This paper instead treats the local truncation error itself as an inductive bias, mapping it to a dynamic spatial attention mask that keeps continuous high-precision evolution in stable zones while activating discrete compensation only where shocks occur. The resulting LTE-ODE trains end-to-end without manifold penalties and reports state-of-the-art accuracy on large-scale benchmarks together with robustness to strong nonlinearities.

Core claim

By converting the local truncation error of the ODE integrator into a dynamic spatial attention mask, LTE-ODE preserves continuous, high-precision evolution wherever the traffic state remains stable and activates a discrete compensation branch exclusively at shock locations, all without any explicit manifold regularization or gradient-penalty terms.

What carries the argument

Local Truncation Error-Guided Neural ODE (LTE-ODE) that turns the per-step integration error into an unsupervised dynamic spatial attention mask.

Load-bearing premise

Mapping the local truncation error to an attention mask will reliably flag only true shocks and will not create new gradient conflicts or attention collapse.

What would settle it

On a traffic dataset containing labeled shock events, measure whether LTE-ODE improves forecast error specifically at those events compared with a plain Neural ODE; if the improvement vanishes or reverses when integration step size changes, the claim fails.

Figures

Figures reproduced from arXiv: 2605.03386 by Mingliang Xu, Ruixiang Wang, Shuo He, Wei Wei, Xiao Zhang, Yafei Li.

Figure 1
Figure 1. Figure 1: The overall framework of the proposed LTE-ODE. (Left) Input & Initialization: The input data X is projected via Winput and concatenated with node embeddings Enode to form initial dual-stream states. (Middle) Continuous-Discrete Evolution Loop: Over S steps, a dual-solver (Euler and RK2) evaluates the continuous vector field fθ. The numerical discrepancy between them yields the Local Truncation Error (LTE) … view at source ↗
Figure 2
Figure 2. Figure 2: Performance Comparison of Training and Inference Time. Notably, while the absolute runtime naturally increases on larger-scale graphs like CA-due to the inherent sensitivity of the ODE solver’s integration steps to massive graph operations-our model maintains robust scalability. Even on the massive CA dataset, LTE-ODE operates up to 4× and 10× faster than PDFormer and UniST, respectively. This demonstrates… view at source ↗
Figure 3
Figure 3. Figure 3: Spatial distribution of the attention mask Mt, demonstrating Attention Collapse. The underlying pathology of the w/ Manifold Penalty variant is unequivocally exposed in the in￾ternal inference dynamics ( view at source ↗
read the original abstract

Spatiotemporal forecasting in physical systems, such as large-scale traffic networks, requires modeling a dual dynamic: continuous macroscopic rhythms and discrete, unpredictable microscopic shocks. While Neural Ordinary Differential Equations (ODEs) excel at capturing smooth evolution, their inherent Lipschitz continuity constraints inevitably cause severe over-smoothing when confronting abrupt anomalies. Recent physics-informed methods attempt to bypass this by penalizing numerical integration errors to enforce manifold smoothness. However, we mathematically reveal that such rigid regularization inherently triggers gradient conflicts and ``attention collapse,'' stripping the model of its sensitivity to anomalies. To resolve this continuity-shock dilemma, we propose Local Truncation Error-Guided Neural ODEs (LTE-ODE). Rather than treating numerical error as a nuisance to be eliminated, we innovatively repurpose the Local Truncation Error (LTE) as an unsupervised forward inductive bias. By mapping the LTE into a dynamic spatial attention mask, our architecture gracefully preserves high-precision continuous ODE evolution in stable regions, while adaptively triggering a discrete compensation branch exclusively at shock points. Trained purely end-to-end without manifold penalties, LTE-ODE achieves state-of-the-art performance on multiple large-scale benchmarks, exhibiting exceptional robustness against highly non-linear fluctuations. Furthermore, our ablation on integration steps demonstrates high deployment flexibility, allowing the model to seamlessly adapt to varying hardware memory constraints in real-world applications.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 1 minor

Summary. The manuscript presents Local Truncation Error-Guided Neural ODEs (LTE-ODE) for large-scale traffic forecasting. It identifies a continuity-shock dilemma in Neural ODEs, where Lipschitz continuity causes over-smoothing on abrupt anomalies, and claims that physics-informed methods using manifold penalties lead to gradient conflicts and attention collapse. The proposed solution repurposes the local truncation error (LTE) as an unsupervised forward inductive bias by mapping it to a dynamic spatial attention mask. This allows high-precision continuous ODE evolution in stable regions and adaptive triggering of a discrete compensation branch at shock points. The model is trained end-to-end without manifold penalties and reportedly achieves state-of-the-art performance on multiple benchmarks with robustness to non-linear fluctuations and flexibility in integration steps.

Significance. If the central mechanism holds, this work could significantly advance the application of Neural ODEs to real-world physical systems with mixed continuous and discrete dynamics by providing an inductive bias that avoids the drawbacks of explicit regularization. The emphasis on end-to-end training and the ablation demonstrating adaptability to hardware constraints are notable strengths. However, the overall significance is limited by the absence of detailed verification that the LTE-based mask selectively identifies shocks without introducing new gradient or stability issues, which is the key innovation.

major comments (2)
  1. [Abstract and §3] Abstract and §3: The assertion that the LTE-to-dynamic-spatial-attention-mask mapping triggers compensation 'exclusively at shock points' while avoiding gradient conflicts is central to the contribution but is presented without a supporting theorem, derivation, or targeted ablation study showing the correlation between LTE magnitude and physical discontinuities independent of solver parameters or data noise.
  2. [§5 Experiments] §5 Experiments: The SOTA claims on large-scale benchmarks are made, but the manuscript does not provide sufficient detail on how the discrete compensation branch is implemented or ablated to confirm it is the source of improved robustness against highly non-linear fluctuations.
minor comments (1)
  1. [Abstract] The abstract mentions 'multiple large-scale benchmarks' but does not name them or provide quantitative improvements, which would strengthen the summary.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their constructive comments, which have helped us improve the clarity and rigor of our work. We address each major comment point by point below.

read point-by-point responses
  1. Referee: [Abstract and §3] Abstract and §3: The assertion that the LTE-to-dynamic-spatial-attention-mask mapping triggers compensation 'exclusively at shock points' while avoiding gradient conflicts is central to the contribution but is presented without a supporting theorem, derivation, or targeted ablation study showing the correlation between LTE magnitude and physical discontinuities independent of solver parameters or data noise.

    Authors: We thank the referee for this observation. The LTE mapping is derived directly from the numerical analysis of ODE solvers, where the local truncation error is known to be larger in regions of high curvature or discontinuities, corresponding to shock points in traffic data. Section 3 presents the mathematical formulation mapping LTE to the attention mask. To further validate the selectivity independent of solver parameters and noise, we will add a dedicated ablation study in the revision that systematically varies these factors and shows the correlation with physical anomalies. This will also address potential concerns about new gradient or stability issues by including stability metrics in the experiments. revision: partial

  2. Referee: [§5 Experiments] §5 Experiments: The SOTA claims on large-scale benchmarks are made, but the manuscript does not provide sufficient detail on how the discrete compensation branch is implemented or ablated to confirm it is the source of improved robustness against highly non-linear fluctuations.

    Authors: We agree that more explicit details on the discrete compensation branch are warranted to substantiate the SOTA claims. In the revised manuscript, we will provide pseudocode and architectural diagrams for the branch, along with a specific ablation experiment that isolates its effect by comparing performance with and without it under non-linear fluctuation scenarios. This will confirm its role in the observed robustness improvements. revision: yes

Circularity Check

0 steps flagged

No significant circularity in the proposed LTE-ODE method

full rationale

The paper's central contribution is a constructive architectural proposal: LTE is computed from the Neural ODE integrator and mapped to a dynamic spatial attention mask to selectively trigger discrete compensation. This mapping is presented as an inductive bias design choice, not as a derived equality or fitted parameter that is then relabeled as a prediction. The abstract's mathematical critique of manifold-penalty methods is an analysis of existing approaches rather than a self-referential loop, and the SOTA claims rest on end-to-end training and benchmark results rather than any reduction of outputs to inputs by definition. No self-citations, uniqueness theorems, or ansatzes imported from prior author work are invoked as load-bearing in the provided text. The derivation chain is therefore self-contained as a novel modeling technique.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axioms · 1 invented entities

Only the abstract is available, limiting visibility into exact parameters and assumptions. The primary invented component is the LTE-derived attention mask; the main domain assumption is the existence of a continuity-shock dilemma in Neural ODEs.

axioms (1)
  • domain assumption Neural ODEs inherently cause severe over-smoothing on abrupt anomalies due to Lipschitz continuity constraints
    This premise is stated directly in the abstract as the starting problem to be solved.
invented entities (1)
  • LTE-based dynamic spatial attention mask no independent evidence
    purpose: To repurpose local truncation error as an unsupervised forward inductive bias that triggers discrete compensation only at shock points
    New architectural component introduced to resolve the stated continuity-shock dilemma without manifold penalties.

pith-pipeline@v0.9.0 · 5550 in / 1424 out tokens · 92217 ms · 2026-05-07T17:27:59.573711+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Reference graph

Works this paper leans on

70 extracted references · 3 canonical work pages · 1 internal anchor

  1. [1]

    Advances in Neural Information Processing Systems , volume=

    Contiformer: Continuous-time transformer for irregular time series modeling , author=. Advances in Neural Information Processing Systems , volume=

  2. [2]

    Advances in neural information processing systems , volume=

    Neural ordinary differential equations , author=. Advances in neural information processing systems , volume=

  3. [3]

    IEEE Transactions on Geoscience and Remote Sensing , volume=

    Dual ODE: Spatial--spectral neural ordinary differential equations for hyperspectral image super-resolution , author=. IEEE Transactions on Geoscience and Remote Sensing , volume=

  4. [4]

    Pattern Recognition , volume=

    Do it yourself dynamic single image super resolution network via ODE , author=. Pattern Recognition , volume=

  5. [5]

    Journal of transportation engineering , volume=

    Modeling and forecasting vehicular traffic flow as a seasonal ARIMA process: Theoretical basis and empirical results , author=. Journal of transportation engineering , volume=

  6. [6]

    Future Generation Computer Systems , volume=

    Hybrid graph convolution neural network and branch-and-bound optimization for traffic flow forecasting , author=. Future Generation Computer Systems , volume=

  7. [7]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Scalable spatiotemporal graph neural networks , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  8. [8]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Spatio-temporal graph neural point process for traffic congestion event prediction , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  9. [9]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Spatio-temporal meta-graph learning for traffic forecasting , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  10. [10]

    Expert Systems with Applications , year=

    Spatio-Temporal Graph Neural Network for Urban Spaces: Interpolating Citywide Traffic Volume , author=. Expert Systems with Applications , year=

  11. [11]

    IEEE Transactions on Intelligent Transportation Systems , volume=

    Graph neural networks for intelligent transportation systems: A survey , author=. IEEE Transactions on Intelligent Transportation Systems , volume=

  12. [12]

    IEEE Transactions on Intelligent Transportation Systems , volume=

    PFNet: Large-scale traffic forecasting with progressive spatio-temporal fusion , author=. IEEE Transactions on Intelligent Transportation Systems , volume=

  13. [13]

    IEEE Transactions on Knowledge and Data Engineering , volume=

    Spatio-temporal joint graph convolutional networks for traffic forecasting , author=. IEEE Transactions on Knowledge and Data Engineering , volume=

  14. [14]

    Information Fusion , volume=

    Enhancement of traffic forecasting through graph neural network-based information fusion techniques , author=. Information Fusion , volume=

  15. [15]

    Knowledge-based systems , volume=

    Dynamic spatial aware graph transformer for spatiotemporal traffic flow forecasting , author=. Knowledge-based systems , volume=

  16. [16]

    Knowledge-Based Systems , volume=

    Dynamic graph convolutional recurrent imputation network for spatiotemporal traffic missing data , author=. Knowledge-Based Systems , volume=

  17. [17]

    Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V

    Efficient large-scale traffic forecasting with transformers: A spatial data management perspective , author=. Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V. 1 , pages=

  18. [18]

    Computer-Aided Civil and Infrastructure Engineering , volume=

    A deep marked graph process model for citywide traffic congestion forecasting , author=. Computer-Aided Civil and Infrastructure Engineering , volume=

  19. [19]

    Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining , pages=

    Irregular traffic time series forecasting based on asynchronous spatio-temporal graph convolutional networks , author=. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining , pages=

  20. [20]

    Proceedings of the 29th ACM SIGKDD conference on knowledge discovery and data mining , pages=

    Transferable graph structure learning for graph-based traffic forecasting across cities , author=. Proceedings of the 29th ACM SIGKDD conference on knowledge discovery and data mining , pages=

  21. [21]

    IEEE Transactions on Intelligent Transportation Systems , volume=

    MG-TAR: Multi-view graph convolutional networks for traffic accident risk prediction , author=. IEEE Transactions on Intelligent Transportation Systems , volume=

  22. [22]

    Advances in neural information processing systems , volume=

    Graph neural networks for road safety modeling: Datasets and evaluations for accident analysis , author=. Advances in neural information processing systems , volume=

  23. [23]

    Advances in Neural Information Processing Systems , volume=

    Largest: A benchmark dataset for large-scale traffic forecasting , author=. Advances in Neural Information Processing Systems , volume=

  24. [24]

    Advances in Neural Information Processing Systems , volume=

    Satformer: Accurate and robust traffic data estimation for satellite networks , author=. Advances in Neural Information Processing Systems , volume=

  25. [25]

    Proceedings of the ACM web conference 2024 , pages=

    Unveiling delay effects in traffic forecasting: a perspective from spatial-temporal delay differential equations , author=. Proceedings of the ACM web conference 2024 , pages=

  26. [26]

    Advances in neural information processing systems , volume=

    Dyffusion: A dynamics-informed diffusion model for spatiotemporal forecasting , author=. Advances in neural information processing systems , volume=

  27. [27]

    Advances in Neural Information Processing Systems , volume=

    Space-time continuous pde forecasting using equivariant neural fields , author=. Advances in Neural Information Processing Systems , volume=

  28. [28]

    2025 IEEE 64th Conference on Decision and Control (CDC) , pages=

    Online traffic density estimation using physics-informed neural networks , author=. 2025 IEEE 64th Conference on Decision and Control (CDC) , pages=

  29. [29]

    Computer Methods in Applied Mechanics and Engineering , volume=

    PINN-FORM: A new physics-informed neural network for reliability analysis with partial differential equation , author=. Computer Methods in Applied Mechanics and Engineering , volume=

  30. [30]

    IEEE Internet of Things Journal , year=

    Embedding Fluid Dynamics into Neural Networks: Towards Interpretable Traffic Flow Prediction via Physics-Informed Learning , author=. IEEE Internet of Things Journal , year=

  31. [31]

    Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence , year=

    Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting , author=. Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence , year=

  32. [32]

    International Conference on Learning Representations , year=

    Diffusion convolutional recurrent neural network: Data-driven traffic forecasting , author=. International Conference on Learning Representations , year=

  33. [33]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Pdformer: Propagation delay-aware dynamic long-range transformer for traffic flow prediction , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  34. [34]

    Advances in neural information processing systems , volume=

    GPT-ST: generative pre-training of spatio-temporal graph neural networks , author=. Advances in neural information processing systems , volume=

  35. [35]

    Proceedings of the 30th ACM SIGKDD conference on knowledge discovery and data mining , pages=

    Unist: A prompt-empowered universal model for urban spatio-temporal prediction , author=. Proceedings of the 30th ACM SIGKDD conference on knowledge discovery and data mining , pages=

  36. [36]

    Proceedings of the VLDB Endowment , volume=

    Bigst: Linear complexity spatio-temporal graph neural network for traffic forecasting on large-scale road networks , author=. Proceedings of the VLDB Endowment , volume=

  37. [37]

    Proceedings of the VLDB Endowment , year=

    GraphSparseNet: A novel method for large scale traffic flow prediction , author=. Proceedings of the VLDB Endowment , year=

  38. [38]

    Proceedings of the 26th ACM SIGSPATIAL international conference on advances in geographic information systems , pages=

    Bike flow prediction with multi-graph convolutional networks , author=. Proceedings of the 26th ACM SIGSPATIAL international conference on advances in geographic information systems , pages=

  39. [39]

    2021 IEEE 37th International Conference on Data Engineering (ICDE) , pages=

    SOUP: A fleet management system for passenger demand prediction and competitive taxi supply , author=. 2021 IEEE 37th International Conference on Data Engineering (ICDE) , pages=

  40. [40]

    Proceedings of the 23rd SIGSPATIAL international conference on advances in geographic information systems , pages=

    Traffic prediction in a bike-sharing system , author=. Proceedings of the 23rd SIGSPATIAL international conference on advances in geographic information systems , pages=

  41. [41]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Preference-aware task assignment in on-demand taxi dispatching: An online stable matching approach , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  42. [42]

    Advances in neural information processing systems , volume=

    Support vector regression machines , author=. Advances in neural information processing systems , volume=

  43. [43]

    arXiv preprint arXiv:1909.02902 , pages=

    Acfm: A dynamic spatial-temporal network for traffic prediction , author=. arXiv preprint arXiv:1909.02902 , pages=

  44. [44]

    IEEE transactions on knowledge and data engineering , volume=

    Fine-grained urban flow inference , author=. IEEE transactions on knowledge and data engineering , volume=

  45. [45]

    arXiv preprint arXiv:1803.01254 , volume=

    Modeling spatial-temporal dynamics for traffic prediction , author=. arXiv preprint arXiv:1803.01254 , volume=

  46. [46]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Deep spatio-temporal residual networks for citywide crowd flows prediction , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  47. [47]

    Proceedings of the 24th ACM SIGSPATIAL international conference on advances in geographic information systems , pages=

    DNN-based prediction model for spatio-temporal data , author=. Proceedings of the 24th ACM SIGSPATIAL international conference on advances in geographic information systems , pages=

  48. [48]

    IEEE Transactions on Knowledge and Data Engineering , volume=

    A survey on modern deep neural network for traffic prediction: Trends, methods and challenges , author=. IEEE Transactions on Knowledge and Data Engineering , volume=

  49. [49]

    Advances in neural information processing systems , volume=

    Convolutional LSTM network: A machine learning approach for precipitation nowcasting , author=. Advances in neural information processing systems , volume=

  50. [50]

    Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery & data mining , pages=

    Co-prediction of multiple transportation demands based on deep spatio-temporal neural network , author=. Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery & data mining , pages=

  51. [51]

    , author=

    Periodic-CRN: A convolutional recurrent model for crowd density prediction with recurring periodic patterns. , author=. Ijcai , volume=

  52. [52]

    IEEE Transactions on Knowledge and Data Engineering , volume=

    Learning dynamics and heterogeneity of spatial-temporal graph data for traffic forecasting , author=. IEEE Transactions on Knowledge and Data Engineering , volume=

  53. [53]

    IEEE Transactions on Knowledge and Data Engineering , volume=

    Spatio-temporal meta learning for urban traffic prediction , author=. IEEE Transactions on Knowledge and Data Engineering , volume=

  54. [54]

    IEEE Transactions on Knowledge and Data Engineering , volume=

    TTPNet: A neural network for travel time prediction based on tensor decomposition and graph embedding , author=. IEEE Transactions on Knowledge and Data Engineering , volume=

  55. [55]

    IEEE Transactions on Knowledge and Data Engineering , volume=

    Predicting citywide crowd flows in irregular regions using multi-view graph convolutional networks , author=. IEEE Transactions on Knowledge and Data Engineering , volume=

  56. [56]

    IEEE transactions on cybernetics , volume=

    Learning graph embedding with adversarial training methods , author=. IEEE transactions on cybernetics , volume=

  57. [57]

    Semi-Supervised Classification with Graph Convolutional Networks

    Semi-supervised classification with graph convolutional networks , author=. arXiv preprint arXiv:1609.02907 , year=

  58. [58]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Attention based spatial-temporal graph convolutional networks for traffic flow forecasting , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  59. [59]

    Advances in neural information processing systems , volume=

    Adaptive graph convolutional recurrent network for traffic forecasting , author=. Advances in neural information processing systems , volume=

  60. [60]

    , author=

    LSGCN: Long short-term traffic prediction with graph convolutional networks. , author=. IJCAI , volume=

  61. [61]

    Proceedings of the 28th International Joint Conference on Artificial Intelligence , year=

    Graph wavenet for deep spatial-temporal graph modeling , author=. Proceedings of the 28th International Joint Conference on Artificial Intelligence , year=

  62. [62]

    Proceedings of the 27th ACM SIGKDD conference on knowledge discovery & data mining , pages=

    Spatial-temporal graph ode networks for traffic flow forecasting , author=. Proceedings of the 27th ACM SIGKDD conference on knowledge discovery & data mining , pages=

  63. [63]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Spatial-temporal fusion graph neural networks for traffic flow forecasting , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  64. [64]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Spatial-temporal synchronous graph convolutional networks: A new framework for spatial-temporal network data forecasting , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  65. [65]

    Proceedings of the 26th ACM SIGKDD international conference on knowledge discovery & data mining , pages=

    Connecting the dots: Multivariate time series forecasting with graph neural networks , author=. Proceedings of the 26th ACM SIGKDD international conference on knowledge discovery & data mining , pages=

  66. [66]

    Proceedings of the 27th ACM SIGKDD conference on knowledge discovery & data mining , pages=

    Dynamic and multi-faceted spatio-temporal deep learning for traffic speed forecasting , author=. Proceedings of the 27th ACM SIGKDD conference on knowledge discovery & data mining , pages=

  67. [67]

    Proceedings of the AAAI conference on artificial intelligence , volume=

    Spatio-temporal pivotal graph neural networks for traffic flow forecasting , author=. Proceedings of the AAAI conference on artificial intelligence , volume=

  68. [68]

    IEEE Transactions on Neural Networks and Learning Systems , volume=

    Bidirectional spatial-temporal adaptive transformer for urban traffic flow forecasting , author=. IEEE Transactions on Neural Networks and Learning Systems , volume=

  69. [69]

    Proceedings of the 29th acm sigkdd conference on knowledge discovery and data mining , pages=

    Localised adaptive spatial-temporal graph neural network , author=. Proceedings of the 29th acm sigkdd conference on knowledge discovery and data mining , pages=

  70. [70]

    Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V

    Graph odes and beyond: A comprehensive survey on integrating differential equations with graph neural networks , author=. Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V. 2 , pages=