KANs, which place learnable univariate spline activations on edges rather than fixed activations on nodes, achieve better accuracy than MLPs at smaller parameter counts, exhibit faster scaling, and admit direct visualization for scientific discovery.
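The edge-activation idea above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: each edge carries its own learnable univariate function expanded in a fixed basis (Gaussian bumps here as a simplified stand-in for the B-splines used in KANs), and a layer sums these edge functions over its inputs.

```python
import numpy as np

class EdgeActivation:
    """One KAN 'edge': a learnable scalar-to-scalar function
    phi(x) = sum_k c_k * b_k(x) over a fixed spline-like basis."""
    def __init__(self, n_basis=8, x_min=-1.0, x_max=1.0, rng=None):
        rng = rng or np.random.default_rng(0)
        self.centers = np.linspace(x_min, x_max, n_basis)  # basis centers ("knots")
        self.width = (x_max - x_min) / n_basis             # bump width
        self.coef = rng.normal(scale=0.1, size=n_basis)    # learnable coefficients

    def __call__(self, x):
        # Evaluate all basis bumps at x, then take the learned combination.
        basis = np.exp(-((np.asarray(x)[..., None] - self.centers) / self.width) ** 2)
        return basis @ self.coef

class KANLayer:
    """A KAN layer: a separate learnable activation on every edge (i -> j),
    with y_j = sum_i phi_{j,i}(x_i), replacing the usual weight matrix."""
    def __init__(self, n_in, n_out):
        self.edges = [[EdgeActivation() for _ in range(n_in)]
                      for _ in range(n_out)]

    def __call__(self, x):
        return np.array([sum(phi(x[i]) for i, phi in enumerate(edges))
                         for edges in self.edges])

layer = KANLayer(n_in=3, n_out=2)
y = layer(np.array([0.1, -0.5, 0.3]))
print(y.shape)  # (2,)
```

Because each edge function is univariate, it can be plotted directly, which is what makes the learned model inspectable for scientific discovery.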
The Kolmogorov superposition theorem can break the curse of dimensionality when approximating high-dimensional functions.
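For reference, the theorem in question (the Kolmogorov-Arnold representation theorem) states that any continuous multivariate function on a bounded domain reduces to sums and compositions of univariate functions:

```latex
f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left(\sum_{p=1}^{n} \phi_{q,p}(x_p)\right),
```

where the $\Phi_q$ and $\phi_{q,p}$ are continuous univariate functions. KANs generalize this two-layer structure to arbitrary depths and widths.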
3 Pith papers cite this work.
Citing papers
- KAN: Kolmogorov-Arnold Networks
  KANs with learnable univariate spline activations on edges achieve better accuracy than MLPs with fewer parameters, faster scaling, and direct visualization for scientific discovery.
- Variable decoupling and the Kolmogorov Superposition Theorem for rational functions
  For rational multivariate functions, the Kolmogorov Superposition Theorem enables variable decoupling by inspection, with no computation, via the Loewner framework.
- Singularity Formation: Synergy in Theoretical, Numerical and Machine Learning Approaches
  The work introduces a modulation-based analytical method for proving singularity formation in singular PDEs and refines machine-learning techniques such as PINNs and KANs to identify blow-up solutions, with application to the open 3D Keller-Segel problem.