Complex Diffusion Maps with ω-Parameterized Kernels Revealing Inherent Harmonic Representations
Recognition: 3 Lean theorem links
Pith review 2026-05-08 19:30 UTC · model grok-4.3
The pith
Complex Diffusion Maps use ω-parameterized complex kernels to reveal dominant harmonic representations that preserve angular structure in high-dimensional data.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The central discovery is that a unified family of ω-parameterized complex-valued kernels defines diffusion operators whose spectra yield complex harmonic maps. These maps preserve angular structure in the complex plane rather than relying on magnitudes alone, and the complex kernel form amplifies distinctions among confusable samples. The approach establishes well-defined diffusion distances and provides an optimization interpretation for the embeddings.
What carries the argument
The ω-parameterized complex-valued kernel, which generates a diffusion operator whose complex eigenmaps maintain angular relationships.
Load-bearing premise
The complex kernels produce diffusion operators with spectra that give meaningful complex harmonic maps preserving angular structure beyond real-valued kernels.
What would settle it
The claim would be challenged if, on synthetic high-noise data with known confusable groups, the method showed no improvement in sample separation or eigengap clarity over standard diffusion maps.
Original abstract
In this paper, we propose Complex Diffusion Maps (CDM), a novel diffusion mapping framework that aims to reveal the dominant complex harmonics of high-dimensional data. Inspired by the local Gaussian kernel relevant to the heat equation and the nonlocal Schrödinger kernel relevant to the Schrödinger equation, we propose a unified family of ω-parameterized complex-valued kernels for the trade-off between local and nonlocal connections. We establish the theoretical foundation based on the operator spectrum theory, where the corresponding diffusion operator, diffusion distance, and complex harmonic maps are well-defined. An optimization-based interpretation of the maps is also developed, aiming to preserve angular structure in the complex diffusion space rather than relying solely on real-valued magnitude. We extensively evaluate CDM on both synthetic and real-world datasets. The complex-valued kernel amplifies differences among easily confusable samples, improving discriminative power over both linear and nonlinear methods based on real-valued kernels. CDM remains robust in high-noise settings, yielding a clearer eigengap that enhances spectral separation. For resting-state fMRI data, CDM captures more strongly correlated and nonlocal spatiotemporal dynamics. Without task-specific tuning, CDM achieves competitive performance on a public EEG sleep dataset, while maintaining high computational efficiency compared with both traditional machine learning and deep neural network approaches, highlighting its generality and practical value.
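The kernel family at the heart of the abstract is concrete enough to sketch numerically. The snippet below is an illustrative reconstruction, not the paper's pipeline: it uses the kernel form quoted later on this page, K(x_i, x_j) = exp(-ω ||x_i - x_j||² / σ²) with ω = e^{iθ}, while the row-stochastic normalization, the dense `numpy.linalg.eig` solver, and the parameter values are assumed choices.

```python
import numpy as np

def complex_diffusion_embedding(X, sigma=1.0, theta=-np.pi / 4, k=2):
    """Illustrative sketch of a complex diffusion embedding.

    Kernel: K(x_i, x_j) = exp(-omega * ||x_i - x_j||^2 / sigma^2) with
    omega = e^{i*theta}, theta in [-pi/2, 0]; theta = 0 recovers the real
    Gaussian kernel. Normalization and solver are assumptions, not the
    paper's exact construction.
    """
    omega = np.exp(1j * theta)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)  # squared distances
    K = np.exp(-omega * d2 / sigma**2)                        # complex kernel matrix
    P = K / K.sum(axis=1, keepdims=True)                      # rows sum to 1
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-np.abs(vals))                         # by eigenvalue magnitude
    # Skip the leading direction (the trivial constant mode when theta = 0)
    # and keep the next k complex coordinates as the embedding.
    return vals[order[1:k + 1]], vecs[:, order[1:k + 1]]

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
vals, emb = complex_diffusion_embedding(X)
print(emb.shape, np.iscomplexobj(emb))  # (40, 2) True
```

Angular information then lives in `np.angle(emb)`, which real-valued diffusion maps discard.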
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper proposes Complex Diffusion Maps (CDM) using a unified family of ω-parameterized complex-valued kernels that interpolate between local Gaussian kernels (heat equation) and nonlocal Schrödinger kernels. It claims to establish a theoretical foundation via operator spectrum theory under which the associated diffusion operator, diffusion distance, and complex harmonic maps are well-defined, with an optimization-based interpretation that preserves angular structure in the complex plane. Empirical evaluations on synthetic data, real-world datasets, resting-state fMRI, and a public EEG sleep dataset report improved discriminative power over real-valued linear and nonlinear methods, greater robustness in high-noise regimes with clearer eigengaps, and competitive performance with high computational efficiency.
Significance. If the extension of diffusion-map theory to complex kernels is rigorously justified and the reported empirical advantages hold under replication, the work could provide a practical tool for datasets exhibiting phase or angular structure, such as neuroimaging time series, while offering a parameter-controlled trade-off between local and nonlocal connectivity. The optimization view of angular preservation and the unified kernel family are potentially generalizable strengths.
major comments (2)
- [Abstract] The claim that 'the corresponding diffusion operator, diffusion distance, and complex harmonic maps are well-defined' via operator spectrum theory is load-bearing for all downstream assertions (discriminative power, eigengap clarity, angular preservation). For a general complex kernel K(x,y) the induced integral operator need not be self-adjoint or normal on L², so the classical spectral theorem supplying a countable orthonormal basis of eigenfunctions with real eigenvalues does not apply directly; no explicit inner-product adjustment, Hermitian symmetrization, or compactness/normalcy proof is referenced to restore these properties.
- [Theoretical development (presumed §3–4)] The optimization-based interpretation that the maps 'preserve angular structure in the complex diffusion space' presupposes that the complex spectrum yields geometrically meaningful harmonics beyond magnitude; without a concrete demonstration that the ω-parameterized kernel produces a compact normal operator (or equivalent) whose eigenfunctions retain angular interpretability, the superiority over real-valued kernels in confusable-sample and high-noise settings remains an unverified extension.
minor comments (2)
- [Abstract and Experiments] The abstract and evaluation sections would benefit from explicit statements of the precise values of ω used in each experiment and any sensitivity analysis, as the single free parameter is central to the claimed trade-off.
- [Experiments] Quantitative tables or figures reporting eigengap sizes, classification accuracies, or correlation strengths with error bars or statistical tests are needed to substantiate claims of 'clearer eigengap' and 'more strongly correlated' dynamics.
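Supporting the request for quantitative eigengap reporting: one simple, reportable statistic is the gap between consecutive eigenvalue magnitudes. A minimal sketch follows; the consecutive-magnitude definition is an assumed convention, not necessarily the one the paper uses.

```python
import numpy as np

def eigengap(eigenvalues, k=1):
    """Gap between the k-th and (k+1)-th largest eigenvalue magnitudes.

    Reporting this per dataset, with error bars over noise realizations,
    would substantiate a 'clearer eigengap' claim quantitatively.
    """
    mags = np.sort(np.abs(np.asarray(eigenvalues)))[::-1]
    return mags[k - 1] - mags[k]

print(eigengap([1.0, 0.75, 0.25, 0.1], k=2))  # 0.5
```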
Simulated Author's Rebuttal
We thank the referee for their careful reading and constructive comments on our manuscript. We address each major comment below and commit to revisions that will strengthen the theoretical justifications as requested.
Point-by-point responses
- Referee: [Abstract] The claim that 'the corresponding diffusion operator, diffusion distance, and complex harmonic maps are well-defined' via operator spectrum theory is load-bearing for all downstream assertions (discriminative power, eigengap clarity, angular preservation). For a general complex kernel K(x,y) the induced integral operator need not be self-adjoint or normal on L², so the classical spectral theorem supplying a countable orthonormal basis of eigenfunctions with real eigenvalues does not apply directly; no explicit inner-product adjustment, Hermitian symmetrization, or compactness/normalcy proof is referenced to restore these properties.
Authors: We appreciate this observation. While our ω-parameterized kernels are constructed to satisfy the Hermitian symmetry condition K(y, x) = conjugate(K(x, y)), which ensures the integral operator is self-adjoint on the complex L² space, we acknowledge that the manuscript does not explicitly state the compactness and normality arguments. In the revised version, we will add a dedicated proposition in Section 3 (or an appendix) proving that, under standard assumptions that the data manifold is compact and the kernel is continuous and bounded, the operator is compact and self-adjoint, thereby justifying the application of the spectral theorem. This will include explicit verification of the boundary cases (ω=0 corresponding to the real Gaussian kernel and ω=1 to the Schrödinger-like kernel) and the interpolation between them. We believe this addresses the concern without altering the core claims. Revision: yes.
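The symmetry claim in the response is easy to probe numerically. The sketch below is purely illustrative (synthetic data, a representative θ, kernel form taken from the passage quoted later on this page): it checks whether the kernel matrix is complex symmetric (K = Kᵀ) or Hermitian (K = K^H), and extracts the Hermitian part as one explicit symmetrization with a guaranteed-real spectrum.

```python
import numpy as np

# Probe the symmetry of K(x_i, x_j) = exp(-omega * ||x_i - x_j||^2 / sigma^2)
# on synthetic data; omega = e^{i*theta} with a representative theta = -pi/4.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-np.exp(-1j * np.pi / 4) * d2)

symmetric = np.allclose(K, K.T)          # K(x, y) = K(y, x)
hermitian = np.allclose(K, K.conj().T)   # K(y, x) = conj(K(x, y))
print(symmetric, hermitian)              # True False

# The Hermitian part (K + K^H)/2 is one explicit symmetrization; the
# spectral theorem then applies to it directly.
H = (K + K.conj().T) / 2
print(np.allclose(H, H.conj().T))        # True
```

For this kernel form the matrix comes out complex symmetric rather than Hermitian when θ ≠ 0, so a revision stating exactly which symmetrization (or which inner product) restores self-adjointness would resolve the referee's concern.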
- Referee: [Theoretical development (presumed §3–4)] The optimization-based interpretation that the maps 'preserve angular structure in the complex diffusion space' presupposes that the complex spectrum yields geometrically meaningful harmonics beyond magnitude; without a concrete demonstration that the ω-parameterized kernel produces a compact normal operator (or equivalent) whose eigenfunctions retain angular interpretability, the superiority over real-valued kernels in confusable-sample and high-noise settings remains an unverified extension.
Authors: We agree that a more explicit link between the operator properties and angular interpretability would strengthen the paper. The optimization view derives from treating the embedding as the minimizer of a complex-valued loss that penalizes deviations in argument (phase) as well as magnitude. To demonstrate this, the revision will include a short subsection or example showing that, for the proposed kernels, the eigenfunctions are complex-valued with phases corresponding to harmonic oscillations, and that the diffusion distance incorporates both real and imaginary parts. This will be supported by the spectral properties established in the new proposition. Regarding empirical superiority, the experiments on synthetic data with confusable samples and high-noise regimes already illustrate the benefits, but we will add a note clarifying that these rely on the well-defined spectrum. We do not claim the superiority follows from the theory alone; it is observed empirically when using the complex maps. Revision: yes.
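A toy version of the promised demonstration can already be run: sample points on a circle, build the quoted kernel with a representative θ, and inspect a leading nontrivial eigenvector. The dataset, σ², normalization, and solver below are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 2 * np.pi, 50))
X = np.c_[np.cos(t), np.sin(t)]                   # samples on the unit circle
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-np.exp(-1j * np.pi / 4) * d2 / 0.5)   # quoted kernel, theta=-pi/4, sigma^2=0.5
P = K / K.sum(axis=1, keepdims=True)              # row-normalized diffusion operator
vals, vecs = np.linalg.eig(P)
psi = vecs[:, np.argsort(-np.abs(vals))[1]]       # a leading nontrivial coordinate

# The coordinate is genuinely complex, so diffusion distances built from
# |psi_i - psi_j| depend on phase (angle) as well as magnitude.
print(np.iscomplexobj(psi), psi.shape)  # True (50,)
```

Plotting `np.angle(psi)` against the circle parameter `t` would make the claimed harmonic-oscillation behavior of the phases directly visible.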
Circularity Check
No significant circularity; derivation is self-contained.
full rationale
The paper defines a family of ω-parameterized complex kernels, invokes standard operator spectrum theory to define the diffusion operator/distance/maps, and provides an optimization view for angular preservation. All claims (eigengap, discriminative power, noise robustness) rest on these definitions plus empirical evaluation rather than reducing to fitted parameters renamed as predictions or to self-citation chains. No equation equates a derived quantity to its own input by construction, and no load-bearing uniqueness theorem is imported from the authors' prior work. The framework extends real-valued diffusion maps without internal circular reduction.
Axiom & Free-Parameter Ledger
free parameters (1)
- ω
axioms (1)
- domain assumption: Operator spectrum theory applies to the diffusion operator induced by the ω-parameterized complex kernel.
invented entities (1)
- Complex harmonic maps (no independent evidence)
Lean theorems connected to this paper
- IndisputableMonolith/Cost.lean (J = ½(x+x⁻¹)−1) and Foundation/AlphaCoordinateFixation.lean: cost_alpha_one_eq_jcost / J_uniquely_calibrated_via_higher_derivative (unclear)
Relation between the paper passage and the cited Recognition theorem is unclear.
Paper passage: "We propose a family of ω-parameterized complex-valued kernels that unify and generalize the Gaussian kernel reflecting local similarity and the Schrödinger kernel reflecting nonlocal connections to obtain a trade-off via the parameter ω."
- IndisputableMonolith/Foundation/AlphaDerivationExplicit.lean: alphaProvenanceCert (parameter-free derivation) (unclear)
Relation between the paper passage and the cited Recognition theorem is unclear.
Paper passage: "Both the kernel bandwidth σ and the complex parameter ω are selected via grid search."
- IndisputableMonolith/Cost/FunctionalEquation.lean: washburn_uniqueness_aczel (unclear)
Relation between the paper passage and the cited Recognition theorem is unclear.
Paper passage: "K(x_i, x_j) = exp(-ω ||x_i - x_j||² / σ²), where ω = e^{iθ}, θ ∈ [-π/2, 0]."
What do these tags mean?
- matches: The paper's claim is directly supported by a theorem in the formal canon.
- supports: The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
- extends: The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
- uses: The paper appears to rely on the theorem as machinery.
- contradicts: The paper's claim conflicts with a theorem or certificate in the canon.
- unclear: Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.
Reference graph
Works this paper leans on
- [1] Svante Wold, Kim Esbensen, and Paul Geladi. Principal component analysis. Chemometrics and Intelligent Laboratory Systems, 2(1-3):37–52, 1987.
- [2] Mark L Davison and Stephen G Sireci. Multidimensional scaling. In Handbook of Applied Multivariate Statistics and Mathematical Modeling, pages 323–352. Elsevier, 2000.
- [3] Bernhard Schölkopf, Alexander Smola, and Klaus-Robert Müller. Kernel principal component analysis. In International Conference on Artificial Neural Networks, pages 583–588. Springer, 1997.
- [4] Joshua B Tenenbaum, Vin de Silva, and John C Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500):2319–2323, 2000.
- [5] Sam T Roweis and Lawrence K Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326, 2000.
- [6] Mikhail Belkin and Partha Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373–1396, 2003.
- [7] Ronald R Coifman, Stephane Lafon, Ann B Lee, Mauro Maggioni, Boaz Nadler, Frederick Warner, and Steven W Zucker. Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps. Proceedings of the National Academy of Sciences of the United States of America, 102(21):7426–7431, 2005.
- [8] Laurens Van der Maaten and Geoffrey Hinton. Visualizing data using t-SNE. Journal of Machine Learning Research, 9(11), 2008.
- [9] Lawrence C Evans. Partial Differential Equations. American Mathematical Society, 2022.
- [10] Steven Rosenberg. The Laplacian on a Riemannian Manifold: An Introduction to Analysis on Manifolds. Cambridge University Press, 1997.
- [11] Pantelis Bouboulis and Sergios Theodoridis. The complex Gaussian kernel LMS algorithm. In International Conference on Artificial Neural Networks, pages 11–20. Springer, 2010.
- [12] Rafael Boloix-Tortosa, F Javier Payán-Somet, Eva Arias-de Reyna, and Juan José Murillo-Fuentes. Complex kernels for proper complex-valued signals: A review. In European Signal Processing Conference, pages 2371–2375. IEEE, 2015.
- [13] Lucas Böttcher and Mason A Porter. Complex networks with complex weights. Physical Review E, 109(2):024314, 2024.
- [14] Marc-Olivier Renou, David Trillo, Mirjam Weilenmann, Thinh P Le, Armin Tavakoli, Nicolas Gisin, Antonio Acín, and Miguel Navascués. Quantum theory based on real numbers can be experimentally falsified. Nature, 600(7890):625–629, 2021.
- [15] Ming-Cheng Chen, Can Wang, Feng-Ming Liu, Jian-Wen Wang, Chong Ying, Zhong-Xia Shang, Yulin Wu, Ming Gong, Hui Deng, F-T Liang, et al. Ruling out real-valued standard formalism of quantum theory. Physical Review Letters, 128(4):040403, 2022.
- [16] Elliott H Lieb and Michael Loss. Fluxes, Laplacians, and Kasteleyn's theorem. In Statistical Mechanics: Selecta of Elliott H. Lieb, pages 457–483. Springer, 1993.
- [17] Michaël Fanuel, Carlos M Alaíz, Ángela Fernández, and Johan AK Suykens. Magnetic eigenmaps for the visualization of directed networks. Applied and Computational Harmonic Analysis, 44(1):189–199, 2018.
- [18] Xitong Zhang, Yixuan He, Nathan Brugnone, Michael Perlmutter, and Matthew Hirn. MagNet: A neural network for directed graphs. In Advances in Neural Information Processing Systems, pages 27003–27015, 2021.
- [19] Hui Zhang, Mile Gu, XD Jiang, Jayne Thompson, Hong Cai, Stefano Paesani, Raffaele Santagati, Anthony Laing, Y Zhang, Man-Hong Yung, et al. An optical neural chip for implementing complex-valued neural network. Nature Communications, 12(1):457, 2021.
- [20] ChiYan Lee, Hideyuki Hasegawa, and Shangce Gao. Complex-valued neural networks: A comprehensive survey. IEEE/CAA Journal of Automatica Sinica, 9(8):1406–1426, 2022.
- [21] Yuri A Kuznetsov. Elements of Applied Bifurcation Theory. Springer, 1998.
- [22] Florian Gomez, Tom Lorimer, and Ruedi Stoop. Signal-coupled subthreshold Hopf-type systems show a sharpened collective response. Physical Review Letters, 116(10):108101, 2016.
- [23] Gustavo Deco, Yonatan Sanz Perl, and Morten L Kringelbach. Complex harmonics reveal low-dimensional manifolds of critical brain dynamics. Physical Review E, 111(1):014410, 2025.
- [24] Nick Laskin. Fractional Schrödinger equation. Physical Review E, 66(5):056108, 2002.
- [25] Vasily E Tarasov and George M Zaslavsky. Fractional dynamics of systems with long-range interaction. Communications in Nonlinear Science and Numerical Simulation, 11(8):885–898, 2006.
- [26] Walter A Strauss. Nonlinear Wave Equations. American Mathematical Society, 1990.
- [27] Gustavo Deco, Yonathan Sanz Perl, Peter Vuust, Enzo Tagliazucchi, Henry Kennedy, and Morten L Kringelbach. Rare long-range cortical connections enhance human information processing. Current Biology, 31(20):4436–4448, 2021.
- [28] Gustavo Deco, Yonatan Sanz Perl, and Morten L Kringelbach. Non-local Schrödinger diffusion model reveals mechanisms of critical brain dynamics. Cell Reports Physical Science, 2025.
- [29] Naum Il'ich Akhiezer and Izrail Markovich Glazman. Theory of Linear Operators in Hilbert Space. Courier Corporation, 2013.
- [30] Amit Singer and H-T Wu. Vector diffusion maps and the connection Laplacian. Communications on Pure and Applied Mathematics, 65(8):1067–1144, 2012.
- [31] Mikhail A Shubin. Discrete magnetic Laplacian. Communications in Mathematical Physics, 164(2):259–275, 1994.
- [32] Alvaro Almeida Gomez, Antônio J Silva Neto, and Jorge P Zubelli. Diffusion representation for asymmetric kernels. Applied Numerical Mathematics, 166:208–226, 2021.
- [33] Mingzhen He, Fan He, Ruikai Yang, and Xiaolin Huang. Diffusion representation for asymmetric kernels via magnetic transform. In Advances in Neural Information Processing Systems, 2023.
- [34] Kazufumi Ito and Kaiqi Xiong. Gaussian filters for nonlinear filtering problems. IEEE Transactions on Automatic Control, 45(5):910–927, 2000.
- [35] S Sathiya Keerthi and Chih-Jen Lin. Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Computation, 15(7):1667–1689, 2003.
- [36] Mikhail Belkin and Partha Niyogi. Convergence of Laplacian eigenmaps. In Advances in Neural Information Processing Systems, 2006.
- [37] Mikhail Belkin and Partha Niyogi. Towards a theoretical foundation for Laplacian-based manifold methods. Journal of Computer and System Sciences, 74(8):1289–1308, 2008.
- [38] Yi Tay, Mostafa Dehghani, Samira Abnar, Yikang Shen, Dara Bahri, Philip Pham, Jinfeng Rao, Liu Yang, Sebastian Ruder, and Donald Metzler. Long range arena: A benchmark for efficient transformers. In International Conference on Learning Representations, 2021.
- [39] David J Griffiths and Darrell F Schroeter. Introduction to Quantum Mechanics. Cambridge University Press, 2018.
- [40] Anthony Zee. Quantum Field Theory in a Nutshell. Princeton University Press, 2010.
- [41] Stephan Garcia and Mihai Putinar. Complex symmetric operators and applications. Transactions of the American Mathematical Society, 358(3):1285–1315, 2006.
- [42] Stephan Garcia and Mihai Putinar. Complex symmetric operators and applications II. Transactions of the American Mathematical Society, 359(8):3913–3931, 2007.
- [43] John B Conway. A Course in Functional Analysis. Springer Science & Business Media, 1994.
- [44] Fan RK Chung. Spectral Graph Theory. American Mathematical Society, 1997.
- [45] Christopher Williams and Matthias Seeger. Using the Nyström method to speed up kernel machines. In Advances in Neural Information Processing Systems, 2000.
- [46] David C Van Essen, Stephen M Smith, Deanna M Barch, Timothy EJ Behrens, Essa Yacoub, Kamil Ugurbil, WU-Minn HCP Consortium, et al. The WU-Minn Human Connectome Project: An overview. Neuroimage, 80:62–79, 2013.
- [47] Matthew F Glasser, Timothy S Coalson, Emma C Robinson, Carl D Hacker, John Harwell, Essa Yacoub, Kamil Ugurbil, Jesper Andersson, Christian F Beckmann, Mark Jenkinson, et al. A multi-modal parcellation of human cerebral cortex. Nature, 536(7615):171–178, 2016.
- [48] Joshua Faskowitz, Farnaz Zamani Esfahlani, Youngheun Jo, Olaf Sporns, and Richard F Betzel. Edge-centric functional network representations of human cerebral cortex reveal overlapping system-level architecture. Nature Neuroscience, 23(12):1644–1654, 2020.
- [49] Sirvan Khalighi, Teresa Sousa, José Moutinho Santos, and Urbano Nunes. ISRUC-Sleep: A comprehensive public dataset for sleep researchers. Computer Methods and Programs in Biomedicine, 124:180–192, 2016.
- [50] Norman H Packard, James P Crutchfield, J Doyne Farmer, and Robert S Shaw. Geometry from a time series. Physical Review Letters, 45(9):712, 1980.
- [51] Floris Takens. Detecting strange attractors in turbulence. In Dynamical Systems and Turbulence, Warwick 1980, pages 366–381. Springer, 2006.
- [52] Trevor Hastie, Robert Tibshirani, and Jerome Friedman. Unsupervised learning. In The Elements of Statistical Learning: Data Mining, Inference, and Prediction, pages 485–585. Springer, 2008.
- [53] Pádraig Cunningham, Matthieu Cord, and Sarah Jane Delany. Supervised learning. In Machine Learning Techniques for Multimedia: Case Studies on Organization and Retrieval, pages 21–49. Springer, 2008.
- [54] J MacQueen. Some methods for classification and analysis of multivariate observations. In Berkeley Symposium on Mathematical Statistics and Probability, pages 281–297. University of California Press, 1967.
- [55] Juan A Gallego, Matthew G Perich, Lee E Miller, and Sara A Solla. Neural manifolds for the control of movement. Neuron, 94(5):978–984, 2017.
- [56] Matthew G Perich, Devika Narain, and Juan A Gallego. A neural manifold view of the brain. Nature Neuroscience, 28(8):1582–1597, 2025.
- [57] Ziyu Jia, Youfang Lin, Jing Wang, Xuehui Wang, Peiyi Xie, and Yingbin Zhang. SalientSleepNet: Multimodal salient wave detection network for sleep staging. In International Joint Conference on Artificial Intelligence, pages 2614–2620, 2021.
- [58] Ziyu Jia, Youfang Lin, Jing Wang, Xiaojun Ning, Yuanlai He, Ronghao Zhou, Yuhan Zhou, and Li-wei H Lehman. Multi-view spatial-temporal graph convolutional networks with domain generalization for sleep stage classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 29:1977–1986, 2021.
- [59] Emadeldeen Eldele, Zhenghua Chen, Chengyu Liu, Min Wu, Chee-Keong Kwoh, Xiaoli Li, and Cuntai Guan. An attention-based deep learning approach for sleep stage classification with single-channel EEG. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 29:809–818, 2021.
- [60] Ziyu Jia, Xiyang Cai, Gaoxing Zheng, Jing Wang, and Youfang Lin. SleepPrintNet: A multivariate multimodal neural network based on physiological time-series for automatic sleep staging. IEEE Transactions on Artificial Intelligence, 1(3):248–257, 2021.
- [61] Thomas N Kipf and Max Welling. Semi-supervised classification with graph convolutional networks. In International Conference on Learning Representations, 2017.
- [62] Matthew F Glasser, Stamatios N Sotiropoulos, J Anthony Wilson, Timothy S Coalson, Bruce Fischl, Jesper L Andersson, Junqian Xu, Saad Jbabdi, Matthew Webster, Jonathan R Polimeni, et al. The minimal preprocessing pipelines for the Human Connectome Project. Neuroimage, 80:105–124, 2013.
- [63] Ronald R Coifman and Stéphane Lafon. Geometric harmonics: A novel tool for multiscale out-of-sample extension of empirical functions. Applied and Computational Harmonic Analysis, 21(1):31–52, 2006.
- [64] Panagiotis G Papaioannou, Ronen Talmon, Ioannis G Kevrekidis, and Constantinos Siettos. Time-series forecasting using manifold learning, radial basis function interpolation, and geometric harmonics. Chaos: An Interdisciplinary Journal of Nonlinear Science, 32(8), 2022.
- [65] Dimitrios G Patsatzis, Lucia Russo, Ioannis G Kevrekidis, and Constantinos Siettos. Data-driven control of agent-based models: An equation/variable-free machine learning approach. Journal of Computational Physics, 478:111953, 2023.
- [66] Pejman Memar and Farhad Faradji. A novel multi-class EEG-based sleep stage classification system. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26(1):84–95, 2017.
- [67] Emina Alickovic and Abdulhamit Subasi. Ensemble SVM method for automatic sleep stage classification. IEEE Transactions on Instrumentation and Measurement, 67(6):1258–1265, 2018.
- [68] Hao Dong, Akara Supratak, Wei Pan, Chao Wu, Paul M Matthews, and Yike Guo. Mixed neural network approach for temporal sleep stage classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26(2):324–333, 2017.
- [69] Lawrence Hubert and Phipps Arabie. Comparing partitions. Journal of Classification, 2(1):193–218, 1985.
- [70] Nguyen Xuan Vinh, Julien Epps, and James Bailey. Information theoretic measures for clusterings comparison: Is a correction for chance necessary? In International Conference on Machine Learning, pages 1073–1080, 2009.
- [71] Ronald A Fisher. The use of multiple measurements in taxonomic problems. Annals of Eugenics, 7(2):179–188, 1936.
- [72] Corinna Cortes and Vladimir Vapnik. Support-vector networks. Machine Learning, 20:273–297, 1995.