Sinc Kolmogorov-Arnold Network and Its Applications on Physics-informed Neural Networks
Abstract
In this paper, we propose to use Sinc interpolation in the context of Kolmogorov-Arnold Networks (KANs), neural networks with learnable activation functions that have recently gained attention as alternatives to multilayer perceptrons. Many different function representations have already been tried, but we show that Sinc interpolation offers a viable alternative, since it is known in numerical analysis to represent both smooth functions and functions with singularities well. This matters not only for function approximation but also for solving partial differential equations with physics-informed neural networks. Through a series of experiments, we show that SincKANs provide better results in almost all of the examples we have considered.
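As a rough illustration of the idea, here is a minimal NumPy sketch of a single learnable edge function built from Sinc interpolation on a uniform grid, phi(x) = sum_k c_k * sinc((x - x_k)/h). The class and parameter names (SincEdge, n_points, half_width) and the initialization are illustrative assumptions, not the paper's exact SincKAN parameterization (which may, for example, include a coordinate transform to handle singularities).

```python
import numpy as np

def sinc_basis(x, grid, h):
    """Evaluate sinc((x - x_k)/h) for every sample x and grid node x_k.

    np.sinc is the normalized sinc, sin(pi t)/(pi t), the cardinal
    function of Sinc interpolation. Returns shape (len(x), len(grid)).
    """
    return np.sinc((x[:, None] - grid[None, :]) / h)

class SincEdge:
    """One learnable activation phi(x) = sum_k c_k * sinc((x - x_k)/h).

    Hypothetical sketch: names and defaults are illustrative, not taken
    from the paper.
    """

    def __init__(self, n_points=11, half_width=3.0, rng=None):
        rng = rng or np.random.default_rng(0)
        self.grid = np.linspace(-half_width, half_width, n_points)
        self.h = self.grid[1] - self.grid[0]              # uniform spacing
        self.coef = rng.normal(scale=0.1, size=n_points)  # learnable c_k

    def __call__(self, x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        return sinc_basis(x, self.grid, self.h) @ self.coef

# Example: evaluate one edge function on a few inputs. A full KAN layer
# would hold one such edge per (input, output) pair, sum the edge
# outputs, and train the coefficients by gradient descent.
phi = SincEdge()
print(phi(np.linspace(-1.0, 1.0, 5)))
```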
This paper has not been read by Pith yet.
Forward citations
Cited by 2 Pith papers
- Scale-Parameter Selection in Gaussian Kolmogorov-Arnold Networks. A stable operating interval for the Gaussian scale parameter ε in KANs is ε ∈ [1/(G-1), 2/(G-1)], derived from first-layer feature geometry and validated across multiple approximation and physics-informed problems.
- General Explicit Network (GEN): A novel deep learning architecture for solving partial differential equations. GEN is a neural network that solves PDEs by constructing explicit function approximations from basis functions based on prior PDE knowledge, yielding more robust and extensible solutions than standard PINNs.
Discussion (0)