Structure-Preserving Reconstruction of Convex Lipschitz Functionals on Hilbert Spaces from Finite Samples

Any convex L-Lipschitz functional on a compact convex subset of a separable Hilbert space can be uniformly approximated to arbitrary accuracy by an explicit convex L-Lipschitz reconstruction from finitely many linear measurements, exactly implementable by a ReLU MLP.
arXiv preprint arXiv:2407.18384, 2024.
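
The construction itself is in the paper; for orientation only, here is a minimal finite-dimensional sketch under my own assumptions (R^d standing in for the Hilbert space, and the "linear measurements" taken to be values and subgradients of the target at sample points). It shows how a convex, Lipschitz, exactly ReLU-implementable reconstruction can arise as a max-affine envelope:

```python
import numpy as np

# Sketch only: R^d stands in for the separable Hilbert space, and the
# measurements are values and subgradients of the target F at sample
# points (my assumption, not the paper's construction). The max-affine
# envelope below is convex, is 1-Lipschitz whenever every subgradient has
# norm <= 1, and unfolds into a small ReLU network because
# max(a, b) = a + relu(b - a).

rng = np.random.default_rng(0)
d = 5
F = lambda x: np.linalg.norm(x)  # convex, 1-Lipschitz target

anchors = rng.uniform(-1.0, 1.0, size=(50, d))
values = np.array([F(x) for x in anchors])
grads = anchors / np.maximum(np.linalg.norm(anchors, axis=1, keepdims=True), 1e-12)

def reconstruct(x):
    # max_i [ F(x_i) + <g_i, x - x_i> ]: a convex 1-Lipschitz lower envelope.
    return np.max(values + grads @ x - np.einsum("ij,ij->i", grads, anchors))

x = rng.uniform(-1.0, 1.0, size=d)
print(F(x), reconstruct(x))  # envelope underestimates F, tightly for dense anchors
```
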
6 Pith papers cite this work, all from 2026 and all currently unverdicted; polarity classification is still indexing. Representative citing papers are listed below.
Citing papers

Classification Fields: Arbitrarily Fine Recursive Hierarchical Clustering From Few Examples
Classification fields are infinite recursive hierarchical cluster structures generated by a local refinement rule; a ReLU-network predictor learned from finite prefixes can approximate the generator and extend it to deeper levels, with exponential convergence in the completed-cell metric.
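
As a toy illustration only (interval cells and a fixed binary split ratio are my assumptions, not the paper's refinement rule), recursive application of a local rule already yields the geometric cell shrinkage that underlies exponential convergence in a completed-cell metric:

```python
# Minimal sketch: a cell is an interval and the local rule splits it into
# two children at a fixed ratio (assumptions, not the paper's definition).
# Cell diameter shrinks geometrically with depth, which is the source of
# exponential convergence in a completed-cell metric.

def refine(cell, depth, max_depth, ratio=0.4):
    lo, hi = cell
    if depth == max_depth:
        yield cell
        return
    mid = lo + ratio * (hi - lo)       # local rule: fixed split ratio
    yield from refine((lo, mid), depth + 1, max_depth)
    yield from refine((mid, hi), depth + 1, max_depth)

leaves = list(refine((0.0, 1.0), 0, 5))
print(len(leaves), max(hi - lo for lo, hi in leaves))  # 32 cells, diameter <= 0.6**5
```
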

Every Feedforward Neural Network Definable in an o-Minimal Structure Has Finite Sample Complexity
Every fixed finite feedforward neural network definable in an o-minimal structure has finite sample complexity in the agnostic PAC setting.
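
For orientation, finite sample complexity in the agnostic PAC setting means a bound of the usual form m = O((d + log(1/delta)) / eps^2) once the hypothesis class has finite VC or pseudo-dimension, which o-minimal definability of a fixed architecture secures. A sketch of that bound, with an illustrative constant (the constant and the function are mine, not the paper's):

```python
import math

# Standard agnostic PAC sample-size bound for a class of finite VC (or
# pseudo-) dimension d. The constant C is illustrative, not from the paper.

def agnostic_sample_bound(d, eps, delta, C=64):
    return math.ceil(C * (d + math.log(1.0 / delta)) / eps**2)

print(agnostic_sample_bound(d=1000, eps=0.05, delta=0.01))
```
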

Adaptivity Under Realizability Constraints: Comparing In-Context and Agentic Learning
Adaptivity never hinders uniform approximation of task families, but its advantages vary across four scenarios when moving from unrestricted to ReLU-realizable regimes.
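
A much simpler toy than the paper's four scenarios, illustrating only the one-directional claim that adaptivity never hurts and can help by a regime-dependent margin (the threshold-identification task is hypothetical, not from the paper):

```python
# Identify a threshold t in {0, ..., n-1} from comparison queries. A
# nonadaptive scheme needs n queries in the worst case; an adaptive scheme
# (binary search) needs ceil(log2 n). Adaptivity is never worse and here
# is exponentially better; the size of such gaps is what varies by regime.

def adaptive_queries(n, t):
    lo, hi, q = 0, n - 1, 0
    while lo < hi:
        mid = (lo + hi) // 2
        q += 1
        if t <= mid:
            hi = mid
        else:
            lo = mid + 1
    return q

n = 1024
print(max(adaptive_queries(n, t) for t in range(n)), "vs", n)  # 10 vs 1024
```
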

Modeling sequential cognitive states via population-level cortical dynamics
A neural-network approximation of heteroclinic dynamics, interpretable as an Amari-type neural-field system, reproduces sequential transitions among cognitive states.
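
A standard minimal stand-in for such dynamics (a May-Leonard Lotka-Volterra heteroclinic cycle with illustrative parameters, not the paper's neural-field model) already produces sequential switching among metastable states:

```python
import numpy as np

# May-Leonard toy of a heteroclinic cycle: three saddle states visited in
# sequence, with small noise regularizing the dwell times. Parameters are
# illustrative; the paper's Amari-type neural field is a continuum model.

rho = np.array([[1.0, 0.5, 2.0],
                [2.0, 1.0, 0.5],
                [0.5, 2.0, 1.0]])   # asymmetric inhibition drives the cycle

rng = np.random.default_rng(1)
x = np.array([0.8, 0.1, 0.1])
dt = 0.05
for step in range(6000):
    # Euler step of dx_i/dt = x_i (1 - (rho x)_i), clipped to stay nonnegative.
    x = np.maximum(x + dt * x * (1.0 - rho @ x) + 1e-6 * rng.random(3), 0.0)
    if step % 600 == 0:
        print(step, "dominant state:", int(np.argmax(x)))
```
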

Adaptive Norm-Based Regularization for Neural Networks
Covariance-aware ridge and combined ℓ1-ℓ2 regularizers for neural networks yield better predictive performance and complexity control than standard penalties, in simulations and in applications to cooling-load prediction and leukemia classification.
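
A sketch of the two penalty shapes under my own reading (covariance-aware ridge as a quadratic weighted by the empirical feature covariance, and the combined term as elastic-net-style); the paper's exact definitions may differ:

```python
import numpy as np

# Hedged sketch, not the paper's definitions: a covariance-aware ridge
# penalizes weights through the empirical feature covariance instead of
# the identity, and a combined l1-l2 term adds sparsity on top of shrinkage.

def covariance_ridge(w, X, lam):
    Sigma = np.cov(X, rowvar=False)   # empirical feature covariance
    return lam * w @ Sigma @ w        # w^T Sigma w instead of w^T w

def combined_l1_l2(w, lam1, lam2):
    return lam1 * np.abs(w).sum() + lam2 * (w ** 2).sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
w = rng.normal(size=4)
print(covariance_ridge(w, X, 0.1), combined_l1_l2(w, 0.01, 0.1))
```
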