Gaussian Processes and Kernel Methods: A Review on Connections and Equivalences
Abstract
This paper is an attempt to bridge the conceptual gaps between researchers working on the two widely used approaches based on positive definite kernels: Bayesian learning or inference using Gaussian processes on the one side, and frequentist kernel methods based on reproducing kernel Hilbert spaces on the other. It is widely known in machine learning that these two formalisms are closely related; for instance, the estimator of kernel ridge regression is identical to the posterior mean of Gaussian process regression. However, they have been studied and developed almost independently by two essentially separate communities, and this makes it difficult to seamlessly transfer results between them. Our aim is to overcome this potential difficulty. To this end, we review several old and new results and concepts from either side, and juxtapose algorithmic quantities from each framework to highlight close similarities. We also provide discussions on subtle philosophical and theoretical differences between the two approaches.
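The equivalence the abstract singles out, that the kernel ridge regression (KRR) estimator coincides with the posterior mean of Gaussian process (GP) regression, is easy to verify numerically. Below is a minimal NumPy sketch, not taken from the paper itself; the squared-exponential kernel and the regularization convention lam = sigma^2 / n are illustrative assumptions. Under that convention, both predictors reduce to the same expression k_*(K + sigma^2 I)^{-1} y.

```python
import numpy as np

# Minimal numerical check of the identity quoted in the abstract:
# with regularization lam = sigma^2 / n, the kernel ridge regression
# estimate equals the Gaussian process regression posterior mean.
rng = np.random.default_rng(0)

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel matrix: k(a, b) = exp(-(a - b)^2 / (2 ell^2)).
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-d2 / (2.0 * ell ** 2))

n, sigma = 20, 0.3
X = rng.uniform(-3, 3, n)                       # training inputs
y = np.sin(X) + sigma * rng.standard_normal(n)  # noisy observations
Xs = np.linspace(-3, 3, 50)                     # test inputs

K = rbf(X, X)    # n x n Gram matrix on training inputs
Ks = rbf(Xs, X)  # cross-covariances between test and training inputs

# GP posterior mean: k_*(K + sigma^2 I)^{-1} y.
gp_mean = Ks @ np.linalg.solve(K + sigma**2 * np.eye(n), y)

# KRR: argmin_f (1/n) sum_i (y_i - f(x_i))^2 + lam ||f||_H^2, whose
# representer-theorem solution predicts k_*(K + n lam I)^{-1} y.
lam = sigma**2 / n
krr = Ks @ np.linalg.solve(K + n * lam * np.eye(n), y)

print(np.max(np.abs(gp_mean - krr)))  # ~1e-16: the two estimators agree
```

The agreement is exact up to floating-point error; the paper's point is that the same quantity carries two readings, a Bayesian posterior mean on one side and a regularized empirical risk minimizer in an RKHS on the other.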
Forward citations
Cited by 8 Pith papers
- Robust and Data-Adaptive Integration of Nonconcurrent Data in Platform Trials via Gaussian Processes
  A Gaussian process model enables data-adaptive integration of nonconcurrent controls in platform trials, reducing posterior variance with a non-increasing bias bound.
- To discretize continually: Mean shift interacting particle systems for Bayesian inference
  Mean shift interacting particle systems generate weighted samples approximating expectations under unnormalized densities by minimizing MMD through normalizing-constant-invariant dynamics.
- Kernel-based guarantees for nonlinear parametric models in Bayesian optimization
  A kernel framework over parameter space yields confidence bounds for regularized nonlinear models on adaptive data, supporting convergence analysis in Bayesian optimization.
- Pressure reconstruction from error-embedded gradient measurements: a Gaussian-process generalization of Green's function integration
  Gaussian process regression reconstructs pressure from error-embedded gradients by treating the field as a random process with a fitted correlation kernel, generalizing Green's function integration as its zero-noise limit.
- Nearly-Optimal Algorithm for Adversarial Kernelized Bandits
  An exponential-weight algorithm achieves Õ(√(T γ_T)) adversarial regret for kernelized bandits and is optimal up to logarithmic factors for squared-exponential and Matérn kernels, with an efficient Nyström variant.
- Fast and Provably Accurate Sequential Designs using Hilbert Space Gaussian Processes
  Hilbert space Gaussian process approximations enable closed-form IMSE acquisition functions with non-asymptotic error bounds for faster and more accurate sequential designs.
- Online Sharp-Calibrated Bayesian Optimization
  OSCBO adaptively balances Gaussian process sharpness and calibration in Bayesian optimization by casting hyperparameter selection as constrained online learning, while preserving sublinear regret bounds.
- Adversarial Robustness of NTK Neural Networks
  NTK networks achieve minimax-optimal adversarial regression rates in Sobolev spaces with early stopping, but minimum-norm interpolants are vulnerable.