An intrinsic effective sample size for manifold MCMC is defined via kernel discrepancy as the number of independent draws from the target whose expected squared discrepancy matches that of the chain.
Cambridge University Press
10 Pith papers cite this work, alongside 421 external citations. Polarity classification is still indexing.
Citing papers explorer
- Intrinsic effective sample size for manifold-valued Markov chain Monte Carlo via kernel discrepancy
  An intrinsic effective sample size for manifold MCMC is defined via kernel discrepancy as the number of independent draws from the target whose expected squared discrepancy matches that of the chain.
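As a rough illustration of that definition, the sketch below estimates such an ESS with a Euclidean Gaussian kernel and a biased (V-statistic) MMD estimator. The kernel choice, the use of a large reference sample as a stand-in for the target, and the c/m decay inversion are simplifications of mine, not the paper's intrinsic construction (a manifold kernel, e.g. heat- or geodesic-based, would replace the Gaussian one).

```python
import numpy as np

def gaussian_gram(X, Y, bw=1.0):
    # Pairwise Gaussian kernel matrix; an intrinsic manifold kernel
    # would replace this in the paper's setting.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bw**2))

def mmd2(X, Y, bw=1.0):
    # Biased (V-statistic) squared maximum mean discrepancy.
    return (gaussian_gram(X, X, bw).mean()
            + gaussian_gram(Y, Y, bw).mean()
            - 2.0 * gaussian_gram(X, Y, bw).mean())

def kernel_ess(chain, reference, bw=1.0):
    # For m i.i.d. target draws, E[MMD^2] decays like c/m with
    # c = E[k(X,X)] - E[k(X,X')]; estimate c from the reference sample
    # and invert to get the number of independent draws whose expected
    # squared discrepancy matches the chain's.
    K = gaussian_gram(reference, reference, bw)
    c = np.diag(K).mean() - K.mean()
    return c / mmd2(chain, reference, bw)
```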
- A second-order method landing on the Stiefel manifold via Newton–Schulz iteration
  A second-order method achieves local quadratic convergence on the Stiefel manifold without retractions by combining a modified Newton tangent step with Newton–Schulz normal steps for constraint satisfaction.
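The Newton–Schulz normal step refers to the classical inverse-free iteration below for pulling an iterate back toward X^T X = I. This is only that standard piece under a minimal setup of mine, not the paper's full tangent-plus-normal scheme.

```python
import numpy as np

def newton_schulz(X, n_iter=5):
    # Newton-Schulz iteration toward the Stiefel manifold (X^T X = I):
    # inverse-free, and quadratically convergent once the singular
    # values of X are close to 1 (they must lie in (0, sqrt(3))).
    I = np.eye(X.shape[1])
    for _ in range(n_iter):
        X = X @ (1.5 * I - 0.5 * (X.T @ X))
    return X

# Usage: restore a slightly perturbed orthonormal frame.
Q, _ = np.linalg.qr(np.random.randn(8, 3))
Xr = newton_schulz(Q + 1e-2 * np.random.randn(8, 3))
print(np.linalg.norm(Xr.T @ Xr - np.eye(3)))  # ~ machine precision
```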
- Profile Likelihood Inference for Anisotropic Hyperbolic Wrapped Normal Models on Hyperbolic Space
  The profile maximum likelihood estimator for the location in anisotropic hyperbolic wrapped normal models is strongly consistent, asymptotically normal, and attains the Hájek–Le Cam minimax lower bound under squared geodesic loss.
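For context, the wrapped normal on hyperbolic space is commonly built as below on the Lorentz model: a Gaussian in the tangent space at the origin, parallel-transported to the location mu, then pushed through the exponential map. This sampler is a generic sketch of that construction with an anisotropic covariance Sigma, not the paper's estimator; mu is assumed to lie on the hyperboloid (<mu,mu>_L = -1, mu_0 > 0).

```python
import numpy as np

def minkowski(u, v):
    # Lorentzian inner product <u,v>_L = -u0*v0 + u1*v1 + ... on R^{d+1}.
    return -u[..., 0] * v[..., 0] + (u[..., 1:] * v[..., 1:]).sum(-1)

def sample_wrapped_normal(mu, Sigma, n, rng=None):
    # Anisotropic wrapped normal on the Lorentz model of H^d:
    # N(0, Sigma) in the tangent space at the origin o, parallel
    # transport o -> mu, then the exponential map.
    rng = np.random.default_rng(rng)
    d = Sigma.shape[0]
    o = np.zeros(d + 1)
    o[0] = 1.0
    v = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
    v = np.hstack([np.zeros((n, 1)), v])                    # tangent at o
    alpha = -minkowski(o, mu)
    u = v + (minkowski(mu, v) / (alpha + 1.0))[:, None] * (o + mu)
    r = np.sqrt(np.maximum(minkowski(u, u), 1e-300))[:, None]
    return np.cosh(r) * mu + np.sinh(r) * u / r             # exp map at mu
```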
- Diffusion Processes on Implicit Manifolds
  Implicit Manifold-valued Diffusions (IMDs) are data-driven SDEs built from proximity graphs that converge in law to smooth manifold diffusions as the sample count increases.
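A toy version of the underlying idea: random walks on a proximity graph built from samples approximate, after suitable time rescaling, a diffusion on the manifold the samples came from. The k-NN walk below is my minimal illustration, not the paper's IMD construction.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_walk(points, start, n_steps=500, k=10, rng=None):
    # Random walk on a k-nearest-neighbour proximity graph. As the
    # number of sample points grows (with k and the step size rescaled
    # appropriately), such walks approximate a diffusion on the
    # underlying manifold.
    rng = np.random.default_rng(rng)
    tree = cKDTree(points)
    idx = start
    path = [idx]
    for _ in range(n_steps):
        _, nbrs = tree.query(points[idx], k=k + 1)  # nbrs[0] is idx itself
        idx = int(rng.choice(nbrs[1:]))
        path.append(idx)
    return points[path]

# Usage: walk on points sampled from a circle embedded in R^2.
theta = np.random.default_rng(0).uniform(0, 2 * np.pi, 2000)
pts = np.c_[np.cos(theta), np.sin(theta)]
trajectory = knn_walk(pts, start=0)
```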
- A Riemannian quasi-Newton algorithm for optimization with Euclidean bounds
  A Riemannian L-BFGS method with adapted Cauchy-point bound handling outperforms classical interior-point and L-BFGS-B solvers on mixed manifold-plus-bounds problems by orders of magnitude.
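For reference, the core of any L-BFGS variant is the two-loop recursion below, shown in its plain Euclidean form. A Riemannian version additionally vector-transports the stored curvature pairs into the current tangent space, and the cited method's adapted Cauchy-point bound handling is not sketched here.

```python
import numpy as np

def lbfgs_direction(grad, s_hist, y_hist):
    # Standard L-BFGS two-loop recursion: applies the implicit inverse
    # Hessian approximation built from curvature pairs (s_i, y_i).
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if s_hist:  # initial Hessian scaling gamma_k
        q *= (s_hist[-1] @ y_hist[-1]) / (y_hist[-1] @ y_hist[-1])
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
        beta = (y @ q) / (y @ s)
        q += (a - beta) * s
    return -q  # quasi-Newton descent direction
```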
- Scale selection for geometric medians on product manifolds
  Joint location-scale minimization for geometric medians on product manifolds degenerates to marginal medians, and three new scale-selection methods restore identifiability with asymptotic guarantees.
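The object whose scale is being selected is the geometric median. For orientation only, the classical Weiszfeld iteration below computes it in the Euclidean case; on a manifold the differences become log maps and the weighted-mean update an exp-map step. A sketch of mine, not the paper's scale-selection procedure.

```python
import numpy as np

def weiszfeld(points, n_iter=200, eps=1e-12):
    # Weiszfeld iteration for the geometric median (Euclidean case):
    # iteratively re-weighted mean with weights 1 / distance.
    m = points.mean(axis=0)
    for _ in range(n_iter):
        d = np.linalg.norm(points - m, axis=1)
        w = 1.0 / np.maximum(d, eps)
        m = (w[:, None] * points).sum(axis=0) / w.sum()
    return m
```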
- Negative curvature obstructs the existence of good barriers for interior-point methods
  Negative curvature forces the barrier parameters of geodesic balls and triangles in hyperbolic space to grow polynomially with the diameter, blocking efficient interior-point methods on the exponentially large domains arising in scaling problems.
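The quantitative link, stated here in its Euclidean form for orientation (the Riemannian version replaces D by covariant derivatives), is that path-following interior-point methods take on the order of the square root of the barrier parameter in Newton steps, so a parameter growing with the diameter is fatal when that diameter is exponentially large:

```latex
% \nu-self-concordant barrier F (Euclidean form):
\[
  \bigl|D^3 F(x)[h,h,h]\bigr| \le 2\,\bigl(D^2 F(x)[h,h]\bigr)^{3/2},
  \qquad
  \bigl(D F(x)[h]\bigr)^2 \le \nu\, D^2 F(x)[h,h].
\]
% Path-following IPMs need $O(\sqrt{\nu}\,\log(1/\varepsilon))$ Newton
% steps, so a barrier parameter that grows polynomially with the
% diameter rules out efficiency on exponentially large domains.
```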
- Generalization of Zeroth-Order Method for Quotients of Quadratic Functions
  A generalized zeroth-order method samples random directions on the sphere to optimize quotients of quadratics, estimates Riemannian derivatives with surrogates, and yields an accelerated algorithm outperforming prior work.
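A generic zeroth-order surrogate of that kind is sketched below: random tangent directions at a point on the sphere, geodesic probes, and averaged finite differences. This is my own minimal illustration of the estimator type, not the paper's accelerated algorithm.

```python
import numpy as np

def zo_sphere_grad(f, x, mu=1e-3, n_dirs=64, rng=None):
    # Zeroth-order surrogate for the Riemannian gradient of f on the
    # unit sphere: probe f along random geodesic directions in the
    # tangent space at x and average finite differences. Up to O(mu)
    # and a dimension factor (E[uu^T] = P_x / (d - 1) on unit tangents),
    # this estimates grad f(x) / (d - 1).
    rng = np.random.default_rng(rng)
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(n_dirs):
        u = rng.standard_normal(x.size)
        u -= (u @ x) * x                        # project onto T_x S^{d-1}
        u /= np.linalg.norm(u)
        x_mu = np.cos(mu) * x + np.sin(mu) * u  # exact geodesic (exp map)
        g += (f(x_mu) - fx) / mu * u
    return g / n_dirs

# Usage on the Rayleigh quotient x^T A x / x^T x, which reduces to
# x^T A x on the sphere (a quotient of quadratics).
A = np.diag([3.0, 1.0, 0.5])
x = np.ones(3) / np.sqrt(3.0)
print(zo_sphere_grad(lambda v: v @ A @ v, x))
```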
- Nonconvex optimization methods for ground states in disordered continuous-spin models
  Monotonic Basin Hopping outperforms MultiStart at locating lower-energy ground states in the random field XY model after the Hamiltonian is reformulated on spheres for Riemannian optimization.
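The outer loop of Monotonic Basin Hopping is simple, as the generic sketch below shows; the names, the Euclidean perturbation, and the abstract local solver are mine. For the spin model, local_min would be a Riemannian optimizer and the perturbation applied per spin on the circle or sphere.

```python
import numpy as np

def monotonic_basin_hopping(energy, local_min, x0, n_hops=200,
                            step=0.3, rng=None):
    # Monotonic Basin Hopping: perturb the incumbent minimizer, re-run
    # a local optimizer, and accept only strict improvements (the
    # "monotonic" rule, vs. Metropolis-style basin hopping).
    rng = np.random.default_rng(rng)
    x_best = local_min(x0)
    e_best = energy(x_best)
    for _ in range(n_hops):
        x_try = local_min(x_best + step * rng.standard_normal(x_best.shape))
        e_try = energy(x_try)
        if e_try < e_best:
            x_best, e_best = x_try, e_try
    return x_best, e_best
```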
- Entanglement is Half the Story: Post-Selection vs. Partial Traces
  A hybrid tensor network framework interpolates between classical and quantum models via controllable post-selection, with a trainable hyperparameter that complements bond dimension to enhance quantum machine learning.
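The dichotomy in the title can be seen on a single system-ancilla pair: tracing out the ancilla mixes branches classically, while post-selecting it keeps one coherent branch. A minimal numpy toy of mine, not the paper's tensor-network framework:

```python
import numpy as np

def trace_vs_postselect(psi):
    # psi: pure bipartite state, shape (d_system, d_ancilla), normalized.
    rho_traced = psi @ psi.conj().T            # partial trace over ancilla
    branch = psi[:, 0]                         # project ancilla onto |0>
    rho_post = np.outer(branch, branch.conj())
    rho_post /= np.trace(rho_post)             # renormalize the branch
    return rho_traced, rho_post

# Usage: for a Bell state the traced reduced state is maximally mixed
# (purity 0.5), while the post-selected branch stays pure (purity 1.0).
psi = np.array([[1.0, 0.0], [0.0, 1.0]]) / np.sqrt(2.0)
rho_tr, rho_ps = trace_vs_postselect(psi)
print(np.trace(rho_tr @ rho_tr).real, np.trace(rho_ps @ rho_ps).real)
```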