Pith: machine review for the scientific record

arxiv: 2601.18672 · v3 · submitted 2026-01-26 · 💻 cs.LG


A Dynamic Framework for Grid Adaptation in Kolmogorov-Arnold Networks

Spyros Rigas, Thanasis Papaioannou, Panagiotis Trakadas, Georgios Alexandridis

classification 💻 cs.LG
keywords: adaptation, training, density, grid, curvature-based, dataset, during, feynman
Abstract

Kolmogorov-Arnold Networks (KANs) have recently demonstrated promising potential in scientific machine learning, partly due to their capacity for grid adaptation during training. However, existing adaptation strategies rely solely on input data density, failing to account for the geometric complexity of the target function or metrics calculated during network training. In this work, we propose a generalized framework that treats knot allocation as a density estimation task governed by Importance Density Functions (IDFs), allowing training dynamics to determine grid resolution. We introduce a curvature-based adaptation strategy and evaluate it across synthetic function fitting, regression on a subset of the Feynman dataset and different instances of the Helmholtz PDE, demonstrating that it significantly outperforms the standard input-based baseline. Specifically, our method yields average relative error reductions of 25.3% on synthetic functions, 9.4% on the Feynman dataset, and 23.3% on the PDE benchmark. Statistical significance is confirmed via Wilcoxon signed-rank tests, establishing curvature-based adaptation as a robust and computationally efficient alternative for KAN training.
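The core idea in the abstract — treating knot allocation as density estimation, with a curvature-based Importance Density Function deciding where grid resolution goes — can be sketched in a few lines. The snippet below is an illustrative one-dimensional toy, not the authors' implementation: the function name `place_knots` and the finite-difference curvature estimate are assumptions made for the example, and a real KAN would derive curvature from its learned spline activations during training.

```python
import numpy as np

def place_knots(x, f_vals, n_knots):
    """Toy sketch of curvature-based knot allocation: build a density
    from |f''| (the Importance Density Function idea) and place knots
    by inverse-CDF sampling. Illustrative only, not the paper's API."""
    # Sort samples, then estimate |f''| by repeated finite differences.
    order = np.argsort(x)
    xs, fs = x[order], f_vals[order]
    curv = np.abs(np.gradient(np.gradient(fs, xs), xs))
    # Small floor so flat regions still receive a few knots.
    density = curv + 1e-3 * curv.max()
    cdf = np.cumsum(density)
    cdf /= cdf[-1]
    # Equally spaced quantiles of the IDF become knot locations,
    # so knots cluster where curvature is high.
    q = np.linspace(0.0, 1.0, n_knots)
    return np.interp(q, cdf, xs)

x = np.linspace(-1.0, 1.0, 400)
# tanh(8x) is nearly flat except for a sharp transition near x = 0,
# so most knots should land close to the origin.
knots = place_knots(x, np.tanh(8 * x), 12)
```

The contrast with the input-density baseline is that here the grid follows the geometry of the target, not the distribution of the inputs: for uniformly sampled `x`, a purely input-based strategy would spread the 12 knots evenly, while this sketch concentrates them in the high-curvature transition region.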

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. KANs need curvature: penalties for compositional smoothness

    cs.LG · 2026-05 · unverdicted · novelty 7.0

    A curvature penalty for KANs, derived to respect compositional effects and equipped with a proven upper bound on full-model curvature, produces smoother activations while preserving accuracy.
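The citing paper's penalty can be illustrated with a generic discrete curvature regularizer: the mean squared second derivative of an activation over its input range. This is a standard smoothness penalty used as a stand-in here — the cited work's compositional formulation and its upper bound on full-model curvature are not reproduced.

```python
import numpy as np

def curvature_penalty(activ_vals, x):
    """Generic smoothness regularizer: mean squared second derivative
    of a sampled activation. Illustrative stand-in for the curvature
    penalty described in the cited paper, not its exact form."""
    d2 = np.gradient(np.gradient(activ_vals, x), x)
    return float(np.mean(d2 ** 2))

x = np.linspace(0.0, 1.0, 200)
smooth = curvature_penalty(np.sin(np.pi * x), x)       # one gentle arch
wiggly = curvature_penalty(np.sin(5 * np.pi * x), x)   # five oscillations
# The oscillatory activation pays a far larger penalty, which is what
# drives activations toward smoother shapes when the term is added
# to the training loss.
```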