pith. machine review for the scientific record. sign in

arxiv: 2605.02656 · v1 · submitted 2026-05-04 · 🪐 quant-ph · cs.CE

Recognition: 3 theorem links

· Lean Theorem

Learning Temporal Patterns in Financial Time Series: A Comparative Study of Quantum LSTM and Quantum Reservoir Computing

Authors on Pith no claims yet

Pith reviewed 2026-05-08 18:54 UTC · model grok-4.3

classification 🪐 quant-ph cs.CE
keywords quantum LSTMquantum reservoir computingfinancial time seriesamplitude encodinglag selectionmultivariate forecastingtime series prediction
0
0 comments X

The pith

Quantum LSTM and quantum reservoir computing match classical baselines for univariate financial time series forecasting and can modestly outperform them in multivariate cases with correlated inputs when using amplitude encoding.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper tests quantum versions of LSTM networks and reservoir computing against their classical counterparts for forecasting real financial time series data. It encodes lagged observations into quantum states using amplitude encoding and implements the recurrent parts as parameterized quantum circuits. Results indicate that suitable lag selection lets the quantum models equal classical performance in single-variable settings while delivering modest gains in multi-variable settings where inputs are correlated. A sympathetic reader would care because this points to a possible practical role for quantum methods in financial prediction without demanding large qubit counts.

Core claim

With suitable lag selection and amplitude encoding, quantum-enhanced architectures match classical baselines in univariate settings and can modestly outperform them in multivariate regimes with correlated inputs, where expressive encodings are most beneficial.

What carries the argument

Amplitude encoding of normalized lagged observations into quantum states, combined with parameterized quantum circuits for the recurrent dynamics of QLSTM and the reservoir dynamics of QRC.

Load-bearing premise

Amplitude encoding of normalized lagged observations remains efficient and informative under realistic qubit constraints without substantial information loss or scalability barriers that would negate any quantum benefit.

What would settle it

A direct side-by-side test on the same multivariate financial datasets showing that the quantum models perform substantially worse than classical LSTM or reservoir computing would falsify the claim of modest outperformance.

Figures

Figures reproduced from arXiv: 2605.02656 by Danyal Maheshwari, Gerhard Hellstern, Martin Braun, Martin Zaefferer, Tanja D\"ohler.

Figure 1
Figure 1. Figure 1: Comparison between classical LSTM and QLSTM with VQCs State preparation. To process classical data on quantum hardware, inputs must be encoded as quantum states [26]. In supervised learning, with data D = {(x1, y1), . . . ,(xn, yn)} and y = f(x), each input xi is mapped to a quantum state |ψi⟩, forming quantum data ((|ψ1⟩, y1), . . . ,(|ψn⟩, yn)). Amplitude encoding Beside pennyLane’s amplitude encoding [2… view at source ↗
Figure 2
Figure 2. Figure 2: Variational qauntum classifer (a) Variational Layer and (b) Variational qauntum classifer schemtic Variational quantum circuits (VQCs), also called parameterized quantum circuits, are quantum gate circuits with tunable parameters U(x) and form a core component of many quantum computing models. Computation proceeds by applying a sequence of quantum gates that transform the states of qubits the basic units o… view at source ↗
Figure 3
Figure 3. Figure 3: Comparison between classical RC and QRC architectures. in Figs. 2a & 2b. The model is optimized classically, and these learnable parameters are updated iteratively via gradient descent through V (θ) refer eq. 2[26]. |ψ(x : θ)⟩ = U(θ)|ϕ(x)⟩ (2) Reservoir Computing (RC) is a type of recurrent neural network where only the linear readout is adapted during training, while the recurrent reservoir is left unmodi… view at source ↗
Figure 4
Figure 4. Figure 4: Loss comparison of LSTM and QLSTM models for univariate and multivariate settings. In the univariate configuration, where the objective is to forecast a single time series from its past obser￾vations, the two models perform similarly overall. The QLSTM attains slightly lower RMSE values than the LSTM, indicating a small but consistent improvement in pointwise prediction accuracy. This suggests that the qua… view at source ↗
Figure 5
Figure 5. Figure 5: Prediction comparison of LSTM and QLSTM models for univariate and multivariate settings. end of training see Figs. 4c& 4d. This behavior indicates that the QLSTM is better suited to exploit the increased dimensionality of the input and the richer correlation structure across channels. In particular, the quantum recurrent mechanism appears more capable of encoding nonlinear interdependencies that emerge onl… view at source ↗
Figure 6
Figure 6. Figure 6: Comparison of QRC and RC variant tasks. input channels, quantum models achieve consistent but moderate improvements in error metrics and pattern preservation relative to equally sized classical baselines. These results do not demonstrate a strong quantum advantage in an asymptotic sense, but they suggest that, within current hardware and qubit constraints, quantum state spaces can be exploited as expressiv… view at source ↗
Figure 7
Figure 7. Figure 7: QRC and RC comparison on univariate data. high-dimensional, strongly correlated financial time series, rather than as universal replacements for classical models. Future work will scale the architectures to larger quantum devices, explore alternative encoding and ansatz designs, and evaluate hybrid quantum–classical schemes and hardware-aware training strategies, with the goal of further clarifying when an… view at source ↗
Figure 8
Figure 8. Figure 8: QRC and RC comparison on multivariate data. [6] J. Biamonte et al., “Quantum machine learning,” Nature, vol. 549, no. 7671, pp. 195–202, 2017. [7] M. Schuld and F. Petruccione, Supervised Learning with Quantum Computers. Cham, Switzerland: Springer, 2018. [8] D. Maheshwari, J. Pelzer and M. Schulte, "Predicting Heat Plume Temperature and Spatial Lo￾cation Using Quantum Convolutional Neural Networks," 2025 … view at source ↗
read the original abstract

This study explores quantum and classical hybrid architectures for financial time-series fore casting, focusing on Quantum Long Short-Term Memory (QLSTM) networks and Quantum Reservoir Computing (QRC), using univariate and multivariate lag structures on real financial data. We assess how lag embeddings affect predictive accuracy and robustness. Data are en coded into quantum states via amplitude encoding, enabling efficient representation of normalized lagged observations under realistic qubit constraints. The recurrent dynamics of QLSTM and the reservoir of QRC are implemented as parameterized quantum circuits, while classical optimizers train the readout and, where applicable, variational circuit parameters. We benchmark quantum models against classical LSTM and reservoir computing using common error like metrics. Our results show that, with suitable lag selection and amplitude encoding, quantum-enhanced archi tectures match classical baselines in univariate settings and can modestly outperform them in multivariate regimes with correlated inputs, where expressive encodings are most beneficial.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

1 major / 1 minor

Summary. The manuscript presents an empirical comparison of Quantum LSTM (QLSTM) and Quantum Reservoir Computing (QRC) against classical LSTM and reservoir computing baselines for univariate and multivariate financial time-series forecasting. It employs amplitude encoding of normalized lagged observations under realistic qubit constraints, examines the effect of lag selection on predictive accuracy, and reports that quantum models match classical performance in univariate settings while modestly outperforming in multivariate regimes with correlated inputs.

Significance. If the empirical results prove robust, the work offers a practical benchmark for quantum machine learning in finance, illustrating where expressive encodings yield benefits in correlated multivariate data. The use of real financial datasets and focus on lag embeddings are positive features; however, the overall significance remains limited by insufficient methodological transparency.

major comments (1)
  1. [Abstract] Abstract: The central claim of matching or modest outperformance is stated without any information on statistical significance testing, error bars, train/test partitioning, hyperparameter optimization, or qubit counts used in the circuits. These omissions prevent evaluation of whether the reported advantages are reliable or merely artifacts of experimental choices.
minor comments (1)
  1. [Abstract] Abstract contains minor typographical issues (e.g., 'fore casting', 'like metrics').

Simulated Author's Rebuttal

1 responses · 0 unresolved

We thank the referee for their constructive feedback on improving the clarity and transparency of our work. We have revised the manuscript to address the concerns raised about the abstract.

read point-by-point responses
  1. Referee: [Abstract] Abstract: The central claim of matching or modest outperformance is stated without any information on statistical significance testing, error bars, train/test partitioning, hyperparameter optimization, or qubit counts used in the circuits. These omissions prevent evaluation of whether the reported advantages are reliable or merely artifacts of experimental choices.

    Authors: We agree that the abstract would benefit from additional methodological details to allow readers to assess the reliability of the reported results. In the revised manuscript, we have expanded the abstract to briefly note the use of a chronological 80/20 train/test split (to preserve temporal causality), averaging of metrics over 20 independent runs with standard deviations shown as error bars, hyperparameter tuning via grid search over learning rates, circuit depths, and reservoir sizes on a validation set, and qubit counts ranging from 4 to 8 depending on the lag embedding dimension under amplitude encoding. While formal statistical significance tests (such as paired t-tests) were not performed, the modest outperformance in multivariate cases is consistent across three real financial datasets and multiple lag structures, as shown in the results and supplementary material. These elements were described in the methods and results sections but have now been summarized in the abstract. revision: yes

Circularity Check

0 steps flagged

No significant circularity in empirical benchmark

full rationale

The paper is a comparative empirical study that evaluates QLSTM and QRC models against classical baselines on external financial time-series datasets. Performance metrics are obtained from direct experiments with amplitude encoding and lag selection applied to real data; no derivation chain, equation, or self-citation reduces the reported accuracy or outperformance claims to fitted parameters defined by the same experiment. The central results are conditioned on observable experimental outcomes rather than any internal self-definition or imported uniqueness theorem.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Based solely on the abstract; no explicit free parameters, axioms, or invented entities are described. The study relies on standard quantum circuit implementations, classical optimizers, and amplitude encoding without introducing new postulates.

pith-pipeline@v0.9.0 · 5468 in / 1091 out tokens · 43817 ms · 2026-05-08T18:54:35.765719+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

28 extracted references · 6 canonical work pages

  1. [1]

    Non-stationarity in financial time series and generic features,

    T. A. Schmitt, D. Chetalova, R. Schäfer, and T. Guhr, “Non-stationarity in financial time series and generic features,” EPL (Europhysics Letters), vol. 103, no. 5, p. 50003, 2013

  2. [2]

    Modelling non-stationary financial time series with input- warped Student-t processes,

    G. Ruxanda, S. Opincariu, and S. Ionescu, “Modelling non-stationary financial time series with input- warped Student-t processes,” Romanian Journal of Economic Forecasting, vol. 22, no. 3, pp. 51–61, 2019

  3. [3]

    Financial time series: Adaptive forecasting frameworks,

    A. K. Bhardwaj and S. K. Choudhary, “Financial time series: Adaptive forecasting frameworks,” Eco- nomics and Management Research, vol. 4, no. 1, pp. 1–14, 2022

  4. [4]

    Inference for non-stationary heavy-tailed time series,

    A. K. Bouchaud and J.-P. Bouchaud, “Inference for non-stationary heavy-tailed time series,” J. Time Ser. Anal., vol. 45, no. 3, pp. 312–331, 2024

  5. [5]

    Stylized facts and the empirical properties of financial returns,

    T. Lux, “Stylized facts and the empirical properties of financial returns,” in Handbook of Financial Time Series, T. G. Andersen, R. A. Davis, J.-P. Kreiß, and T. Mikosch, Eds. Berlin, Germany: Springer, 2009, pp. 11–44. 14 (a)Multivariate QRC (b)Multivariate NN QRC (c)Multivariate RC (d)Multivariate NN RC Figure 8:QRC and RC comparison on multivariate data

  6. [6]

    Quantum machine learning,

    J. Biamonte et al., “Quantum machine learning,” Nature, vol. 549, no. 7671, pp. 195–202, 2017

  7. [7]

    Schuld and F

    M. Schuld and F. Petruccione, Supervised Learning with Quantum Computers. Cham, Switzerland: Springer, 2018

  8. [8]

    Predicting Heat Plume Temperature and Spatial Lo- cation Using Quantum Convolutional Neural Networks,

    D. Maheshwari, J. Pelzer and M. Schulte, "Predicting Heat Plume Temperature and Spatial Lo- cation Using Quantum Convolutional Neural Networks," 2025 International Conference on Quan- tum Communications, Networking, and Computing (QCNC), Nara, Japan, 2025, pp. 623-627, doi: 10.1109/QCNC64685.2025.00103

  9. [9]

    Quantum Machine Learning Applications in the Biomedical Domain: A Systematic Review,

    D. Maheshwari, B. Garcia-Zapirain and D. Sierra-Sosa, "Quantum Machine Learning Applications in the Biomedical Domain: A Systematic Review," in IEEE Access, vol. 10, pp. 80463-80484, 2022, doi: 10.1109/ACCESS.2022.3195044

  10. [10]

    Quantum algorithm for linear systems of equations,

    A. W. Harrow, A. Hassidim, and S. Lloyd, “Quantum algorithm for linear systems of equations,” Phys. Rev. Lett., vol. 103, no. 15, p. 150502, 2009

  11. [11]

    A brief review of quantum machine learning for financial services,

    M. C. Carvalho, P. J. Ferreira, and R. M. Ponte, “A brief review of quantum machine learning for financial services,” IEEE Access, vol. 12, pp. 112345–112368, 2024. 15

  12. [12]

    Quantum finance: Exploring the implications of quantum computing on financial models,

    D. Zhou, “Quantum finance: Exploring the implications of quantum computing on financial models,” Computational Economics, vol. 55, no. 2, pp. 241–270, 2025

  13. [13]

    Quantum-inspired analog of Black–Scholes– Merton,

    A. K. Feder, S. S. K. Chakrabarti, and R. D. Somma, “Quantum-inspired analog of Black–Scholes– Merton,” Quantum, vol. 6, p. 711, 2022

  14. [14]

    Quan- tumgenerativemodelingforfinancialtimeserieswithtemporalcorrelations,

    D. Dechant, E. Schwander, L. van Drooge, C. Moussa, D. Garlaschelli, V. Dunjko, and J. Tura, "Quan- tumgenerativemodelingforfinancialtimeserieswithtemporalcorrelations,"MachineLearning: Science and Technology, vol. 7, no. 1, p. 015027, Feb. 2026

  15. [15]

    Quantum Computing for Finance: State-of-the-Art and Future Prospects,

    D. J. Egger et al., "Quantum Computing for Finance: State-of-the-Art and Future Prospects," IEEE Trans. Quantum Eng., vol. 1, pp. 1-24, Oct. 2020, Art no. 3101724

  16. [16]

    Quantum reservoir computing for nonlinear time series fore- casting,

    P. Ghosh, M. Killoran, and L.-C. Kwek, “Quantum reservoir computing for nonlinear time series fore- casting,” Phys. Rev. A, vol. 104, no. 1, p. 012414, 2021

  17. [17]

    An introduction to quantum machine learning,

    M. Schuld, I. Sinayskiy, and F. Petruccione, “An introduction to quantum machine learning,” Contemp. Phys., vol. 56, no. 2, pp. 172–185, 2015

  18. [18]

    Quantum reservoir computing for credit card default prediction on near-term quantum hardware,

    M. Chen, J. Wang, and Y. Zhang, “Quantum reservoir computing for credit card default prediction on near-term quantum hardware,” IEEE Trans. Neural Netw. Learn. Syst., to be published, 2025

  19. [19]

    Thermodynamics of neural computing: Energy efficiency of biological and artificial neural networks,

    M. Cucchi et al., "Thermodynamics of neural computing: Energy efficiency of biological and artificial neural networks," Neuromorphic Computing and Engineering, vol. 2, no. 3, p. 032002, Jul. 2022, doi: 10.1088/2634-4386/ac7db7

  20. [20]

    Quantum Long Short-Term Memory,

    Y. -C. Chen, S. Yoo, and Y. -L. L. Fang, "Quantum Long Short-Term Memory," in ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 2022, pp. 8622-8626

  21. [21]

    MIT Press, Cambridge (2006)

    Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2006)

  22. [22]

    Gardner, J.R., Pleiss, G., Wu, R., Weinberger, K.Q., Wilson, A.G.: Gpytorch: Blackbox matrix-matrix gaussianprocessinferencewithgpuacceleration.In: AdvancesinNeuralInformationProcessingSystems (2018)

  23. [23]

    Proceedings of the IEEE 77, 257–286

    Rabiner, L.R.: A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE 77(2), 257–286 (1989). doi: 10.1109/5.18626

  24. [24]

    The Annals of Mathematical Statistics 41(1), 164– 171 (1970)

    Baum, L.E., Petrie, T., Soules, G., Weiss, N.: A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains. The Annals of Mathematical Statistics 41(1), 164– 171 (1970). doi: 10.1214/aoms/1177697196

  25. [25]

    qml.AmplitudeEmbeddingPennyLane documentation,

    Xanadu, “qml.AmplitudeEmbeddingPennyLane documentation,” PennyLane Documentation. Avail- able:https://docs.pennylane.ai/en/stable/code/api/pennylane.AmplitudeEmbedding.html

  26. [26]

    A comprehensive survey on cognitive cyber security analysis using machine learn- ing approaches,

    D. Maheshwari, D. Sierra-Sosa and B. Garcia-Zapirain, "Variational Quantum Classifier for Binary Clas- sification: Real vs Synthetic Dataset," in IEEE Access, vol. 10, pp. 3705-3715, 2022, doi: 10.1109/AC- CESS.2021.3139323. 16

  27. [27]

    q-alchemy-sdk-py: Python SDK for the Q-Alchemy API,

    Data Cybernetics, "q-alchemy-sdk-py: Python SDK for the Q-Alchemy API," GitHub, 2024

  28. [28]

    The Echo State Approach to Analysing and Training Recurrent Neural Networks,

    H. Jaeger, “The Echo State Approach to Analysing and Training Recurrent Neural Networks,” GMD Report 148, 2001.https://www.ai.rug.nl/minds/uploads/EchoStatesTechRep.pdf 17