pith. machine review for the scientific record.

arxiv: 2605.04200 · v1 · submitted 2026-05-05 · 🧬 q-bio.NC

Recognition: unknown

Neural Manifolds as Crystallized Embeddings: A Synthesis of the Free Energy Principle, Generalized Synchronization, and Hebbian Plasticity

Vikas N. O'Reilly-Shah

Pith reviewed 2026-05-08 17:28 UTC · model grok-4.3

classification 🧬 q-bio.NC
keywords neural manifolds · free energy principle · generalized synchronization · Hebbian plasticity · continuous attractor networks · reservoir computing · developmental emergence · sensory embedding

The pith

The geometry predicted by the free energy principle arises from generalized synchronization in recurrent circuits and crystallizes via Hebbian plasticity.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper tries to show that the structured geometry of neural representations does not have to be built in by a top-down Bayesian algorithm. A simple contractive recurrent network can synchronize to the patterns in its sensory inputs and thereby embed the underlying manifold in its activity patterns. Hebbian plasticity can then turn those activity correlations into permanent connections, allowing the manifold to persist even without ongoing input. This account treats head-direction cells, grid cells, and visual feature maps as outcomes of development rather than genetic presets. The stakes: if this bottom-up route really works, it replaces an explicit inference engine with ordinary dynamical and learning rules.

Core claim

A contractive recurrent circuit driven by structured sensory input can synchronize to the driving dynamics and embed the low-dimensional sensory manifold into neural state space. Hebbian plasticity acting on the correlations generated by this synchronization may crystallize the embedded manifold into recurrent connectivity, yielding an autonomous continuous attractor network. On this view, mature head-direction, grid-cell, and stimulus-driven visual manifolds are developmental products of dynamical contraction, generalized synchronization, and correlation-based plasticity.

What carries the argument

Generalized synchronization in a contractive recurrent circuit, which embeds the sensory manifold into neural state space under generic reservoir-computing conditions.
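A minimal numerical sketch of this mechanism, not taken from the paper and with all parameters illustrative: a leaky-tanh reservoir whose recurrent weights are scaled to make the update map contractive, driven by a point moving on a circle. Two runs from different initial states converge to the same input-determined trajectory, which is the operational signature of generalized synchronization.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                       # reservoir size (illustrative)
leak = 0.5                    # leak rate
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.8 / np.linalg.norm(W, 2)       # spectral norm 0.8, so the one-step map
W_in = rng.normal(0.0, 1.0, (N, 2))   # has Lipschitz factor (1-leak)+leak*0.8 = 0.9

def run(x0, thetas):
    """Drive the reservoir with points on the sensory circle; return final state."""
    x = x0.copy()
    for th in thetas:
        u = np.array([np.cos(th), np.sin(th)])
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)
    return x

thetas = np.linspace(0.0, 40 * np.pi, 2000) % (2 * np.pi)
xa = run(rng.normal(size=N), thetas)
xb = run(rng.normal(size=N), thetas)   # different initial condition, same drive
gap = np.linalg.norm(xa - xb)          # contraction washes out the initial state
```

Because the per-step contraction factor is at most 0.9, the dependence on the initial condition decays geometrically and `gap` is numerically zero: the asymptotic state is a function of the input history alone, which is what licenses reading the reservoir trajectory as an embedding of the sensory manifold.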

If this is right

  • The geometry predicted by the free energy principle need not be imposed by an explicit Bayesian neural calculus or Taylor expansions.
  • Mature manifolds develop through the interaction of dynamical contraction, generalized synchronization, and Hebbian plasticity.
  • Continuous attractor networks can form when the Hebbian fixed point exists and maintains embedding quality.
  • Attractor geometry depends on input statistics and shows dimensional thresholds for topological recovery.
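The dimensional-threshold point can be illustrated with a toy computation; the trigonometric lift and random readout below are editorial stand-ins for reservoir states and readout weights, not the paper's construction. A one-dimensional readout of circle-driven activity necessarily identifies distant angles, while a generic three-dimensional readout keeps them apart, consistent with Whitney/Takens-style embedding bounds for a one-dimensional manifold.

```python
import numpy as np

rng = np.random.default_rng(2)
thetas = np.linspace(0.0, 2 * np.pi, 400, endpoint=False)
# High-dimensional "neural" lift of the circle (stand-in for reservoir states)
F = np.column_stack([f(k * thetas) for k in (1, 2, 3) for f in (np.cos, np.sin)])

def worst_confusion(Y):
    """Smallest readout distance between angle pairs far apart on the circle."""
    dth = np.abs((thetas[:, None] - thetas[None, :] + np.pi) % (2 * np.pi) - np.pi)
    D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    return D[dth > 1.0].min()

low = worst_confusion(F[:, :1])                      # 1-D readout: cos(theta)
high = worst_confusion(F @ rng.normal(size=(6, 3)))  # generic 3-D readout
# low collapses theta with -theta; high keeps all far angle pairs separated
```

`low` is at machine precision because cos(θ) cannot distinguish θ from 2π − θ, so the circle's topology is unrecoverable from one readout dimension; `high` stays bounded away from zero for a generic three-dimensional readout.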

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • If the Hebbian fixed point does not preserve embedding quality, additional stabilizing mechanisms such as homeostatic plasticity may be required to maintain the manifold.
  • This synthesis suggests reservoir-style embeddings could explain how other structured representations emerge across sensory and cognitive domains.
  • Altering input statistics during a critical developmental window should measurably reshape later attractor geometry.

Load-bearing premise

Hebbian plasticity reaches a stable fixed point that preserves the embedding quality of the synchronization manifold after external drive is removed.

What would settle it

An observation that Hebbian blockade during development prevents formation of stable head-direction or grid-cell attractors, or that manifold stability collapses after input removal in a manner inconsistent with preserved embedding.

original abstract

The free energy principle casts perception as variational inference, but its biological implementation remains underspecified. In particular, the generalized-coordinate formalism should not be read as a literal claim that neurons compute arbitrary Taylor expansions. This paper argues that generalized synchronization provides the missing bottom-up mechanism. A contractive recurrent circuit driven by structured sensory input can synchronize to the driving dynamics. Under generic embedding conditions developed in the reservoir-computing literature, the resulting synchronization map can embed the low-dimensional sensory manifold into neural state space. Thus, the geometry predicted by the free energy principle need not be imposed from above by an explicitly Bayesian neural calculus; it can arise from ordinary recurrent dynamics driven by the world. I then propose a developmental extension. Hebbian plasticity acting on the correlations generated by sensory-driven synchronization may crystallize the embedded manifold into recurrent connectivity, yielding an autonomous continuous attractor network when the required fixed point exists. On this view, mature head-direction, grid-cell, and stimulus-driven visual manifolds are not genetically prespecified templates, but developmental products of three interacting processes: dynamical contraction, generalized synchronization, and correlation-based plasticity. The synthesis links the free energy principle, reservoir-computing embedding theorems, and contraction-theoretic models of Hebbian recurrent networks. It also yields testable predictions about dimensional thresholds for topological recovery, developmental sensitivity to plasticity, and the dependence of attractor geometry on input statistics. The central open problem is whether the Hebbian fixed point exists and preserves the embedding quality of the synchronization manifold.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

1 major / 2 minor

Summary. The manuscript synthesizes the free energy principle with generalized synchronization in contractive recurrent circuits and Hebbian plasticity. It claims that sensory-driven synchronization can embed low-dimensional manifolds into neural state space under generic reservoir-computing conditions, so that FEP geometry arises bottom-up from ordinary dynamics rather than explicit Bayesian computation. A developmental extension proposes that correlation-based plasticity crystallizes these embeddings into autonomous continuous attractor networks (e.g., head-direction, grid-cell, and visual manifolds) when a Hebbian fixed point exists; the paper explicitly flags the existence and embedding-preserving property of this fixed point as the central open problem.

Significance. If the Hebbian fixed-point step can be placed on a firmer footing, the synthesis would supply a mechanistic, bottom-up route from recurrent dynamics to the low-dimensional geometries invoked by the free energy principle, linking reservoir-computing embedding theorems with contraction-theoretic Hebbian models. It generates concrete, testable predictions concerning dimensional thresholds for topological recovery, developmental sensitivity to plasticity, and dependence of attractor geometry on input statistics.

major comments (1)
  1. [Abstract, final paragraph; Hebbian crystallization section] The central developmental claim, that correlation-based plasticity acting on synchronization-generated correlations reaches a fixed point which crystallizes the embedded manifold into an autonomous attractor matching the original sensory geometry, rests on an assumption the manuscript itself identifies as open. No existence conditions are supplied (contraction rates, spectral radius of the Hebbian operator, correlation decay timescales), nor any sketch showing that the embedding map is inherited, so the account of head-direction, grid-cell, and visual manifolds remains conjectural rather than a derived consequence of the three interacting processes.
minor comments (2)
  1. [Introduction and synchronization-embedding paragraphs] The phrase 'generic embedding conditions developed in the reservoir-computing literature' is invoked repeatedly but without citation to the specific theorems (e.g., the precise statements of embedding dimension or observability conditions) that are being imported; adding these references would allow readers to assess the scope of applicability.
  2. [Methods/notation sections] Notation for the synchronization map and the subsequent Hebbian update rule could be made more explicit (e.g., distinguishing the driving signal, the reservoir state, and the plastic weight matrix) to facilitate future attempts to close the open fixed-point problem.
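A notation along the lines the referee requests might read as follows; this is an editorial sketch, with \(u(t)\) the driving signal, \(x(t)\) the reservoir state, \(W\) the plastic recurrent weight matrix, and \(\eta\), \(\lambda\), \(\tau_W\) illustrative constants not taken from the paper:

```latex
% Driven reservoir dynamics; contraction makes x(t) forget its initial condition
\dot{x} = -x + \tanh\!\bigl(W x + W_{\mathrm{in}}\, u(t)\bigr)

% Generalized synchronization: after transients, the state is a function of the
% source state s(t) generating u(t); \phi is the synchronization (embedding) map
x(t) = \phi\bigl(s(t)\bigr)

% Hebbian update on synchronization-generated correlations, with weight decay
\tau_W\, \dot{W} = \eta\, x\, x^{\top} - \lambda\, W
```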

Simulated Author's Rebuttal

1 response · 1 unresolved

We thank the referee for the constructive and precise review. The major comment correctly identifies a key limitation in our presentation of the Hebbian crystallization proposal, which we address point by point below.

point-by-point responses
  1. Referee: [Abstract, final paragraph; Hebbian crystallization section] The central developmental claim, that correlation-based plasticity acting on synchronization-generated correlations reaches a fixed point which crystallizes the embedded manifold into an autonomous attractor matching the original sensory geometry, rests on an assumption the manuscript itself identifies as open. No existence conditions are supplied (contraction rates, spectral radius of the Hebbian operator, correlation decay timescales), nor any sketch showing that the embedding map is inherited, so the account of head-direction, grid-cell, and visual manifolds remains conjectural rather than a derived consequence of the three interacting processes.

    Authors: We agree that the developmental claim is conjectural and rests on the unresolved existence of a Hebbian fixed point that preserves the synchronization embedding. The manuscript already flags this explicitly as the central open problem. No existence conditions or inheritance sketch are supplied because these questions lie beyond current results in contraction-theoretic Hebbian models and remain open. We will revise the abstract and the Hebbian crystallization section to state more explicitly that the proposed accounts of head-direction, grid-cell, and visual manifolds are hypotheses contingent on resolution of this fixed-point issue, rather than direct derivations from the three processes. revision: partial

standing simulated objections not resolved
  • Existence conditions for the Hebbian fixed point (contraction rates, spectral radius of the Hebbian operator, correlation decay timescales) and a sketch demonstrating inheritance of the embedding map under Hebbian plasticity.

Circularity Check

0 steps flagged

No circularity; key developmental link left explicitly open rather than reduced by construction.

full rationale

The manuscript states its central developmental claim conditionally ('when the required fixed point exists') and flags the Hebbian fixed-point existence plus embedding preservation as 'the central open problem' without deriving it or smuggling it via self-citation, ansatz, or renaming. The synchronization-embedding step is attributed to external reservoir-computing literature, not internal fits or self-definitions. No equations or steps in the provided text reduce a claimed prediction to its own inputs by construction. The synthesis therefore remains non-circular, though incomplete on the flagged point.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The proposal rests on one imported domain assumption from reservoir computing and contraction theory plus one ad-hoc developmental step whose validity is left open; no free parameters or new entities are introduced.

axioms (2)
  • domain assumption Contractive recurrent circuits driven by structured sensory input synchronize to the driving dynamics and embed the low-dimensional sensory manifold into neural state space under generic conditions.
    This supplies the bottom-up mechanism that replaces explicit Bayesian computation in the free energy principle.
  • ad hoc to paper Hebbian plasticity on the correlations generated by synchronization can reach a fixed point that crystallizes the embedded manifold into an autonomous continuous attractor network.
    This is the proposed developmental extension; the paper itself identifies its existence and embedding-preserving property as the central open problem.

pith-pipeline@v0.9.0 · 5579 in / 1609 out tokens · 35702 ms · 2026-05-08T17:28:17.214383+00:00 · methodology

