9 Pith papers cite this work, alongside 1,062 external citations. Polarity classification is still indexing.
Year: 2026 · 9 representative citing papers (verdicts pending)
Citing papers explorer
-
Efficient Robust Constrained Signal Detection via Kolmogorov Width Approximations
Polynomial-time SDP and ellipsoid-based approximation of Kolmogorov widths yields efficient robust detection boundaries matching upper bounds up to polylog factors for structured constrained signals.
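As background for the width-based detection boundaries (this is a classical closed form, not the paper's SDP/ellipsoid algorithm): for an axis-aligned ellipsoid the Kolmogorov n-widths are exactly the sorted semi-axes. The function name below is illustrative, not from the paper.

```python
import numpy as np

def ellipsoid_kolmogorov_widths(semi_axes):
    """l2 Kolmogorov n-widths of the axis-aligned ellipsoid
    E = {x : sum_i x_i^2 / a_i^2 <= 1}.

    Classical fact: with semi-axes sorted a_1 >= a_2 >= ..., the n-width
    is d_n(E) = a_{n+1}; the optimal n-dimensional approximating subspace
    is the span of the n longest axes. The returned array satisfies
    widths[n] == d_n(E) for n = 0, ..., d-1.
    """
    return np.sort(np.asarray(semi_axes, dtype=float))[::-1]
```

For general convex constraint sets no such closed form exists, which is why polynomial-time approximation of the widths (and hence of the detection boundary) is the nontrivial step.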
-
Posterior Concentration of Bayesian Physics-Informed Neural Networks for Elliptic PDEs
Bayesian PINNs for elliptic PDEs have posteriors that contract around the true solution at near-optimal rates, with the prior adapting automatically to unknown smoothness.
-
Sharp adaptive nonparametric testing for constant volatility
Constructs a minimax-optimal adaptive test for constant volatility in the nonparametric Gaussian white noise model under infill asymptotics, measuring deviations via the ratio of σ(t) to its L2-average.
-
Self-organized regime switching in null-recurrent dynamics
Profile MLE for the regime-switching threshold in null-recurrent diffusion converges at rate n^{-(1+γ)/2} to the arg sup of a doubly stochastic drifted Poisson process involving local time of oscillating Brownian motion.
-
Does Your Neural Network Extrapolate? Feature Engineering as Identifiability Bias for OOD Generalization
Out-of-distribution extrapolation is non-identifiable from in-distribution data alone; the feature map, label map, and model class supply the identifiability bias that determines whether a network succeeds or fails at OOD generalization.
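A toy illustration of the non-identifiability claim (my construction, not the paper's): two hypotheses with different feature maps fit the training distribution equally well yet extrapolate differently, so in-distribution data alone cannot pick between them.

```python
import numpy as np

# In-distribution inputs: integers 0..5; labels follow the identity map.
x_train = np.arange(6, dtype=float)
y_train = x_train.copy()

def model_a(x):
    """Linear feature map: f(x) = x."""
    return x

def model_b(x):
    """Agrees with model_a at every integer input, since sin(pi * k) = 0
    for integer k, but extrapolates differently off that support."""
    return x + np.sin(np.pi * x)

# Both hypotheses achieve (numerically) zero training error ...
train_err_a = np.max(np.abs(model_a(x_train) - y_train))  # exactly 0
train_err_b = np.max(np.abs(model_b(x_train) - y_train))  # ~1e-15

# ... yet disagree by ~1 outside the training range:
ood_gap = abs(model_a(5.5) - model_b(5.5))
```

Which of the two a trained network converges to is determined by the feature map and model class, i.e. by the identifiability bias the summary describes.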
-
Model Form Identification in High-Dimensional Functional Linear Regressions
MoFI-FLR recovers active covariates and identifies their true functional forms (simple or complex) in high-dimensional functional linear regressions.
-
Empirical Bernstein Confidence Intervals for Kernel Smoothers: A Safe and Sharp Way to Exhaust Assumed Smoothness
Empirical Bernstein confidence intervals for kernel smoothers attain nominal coverage up to a remainder of order n^{-2S/(2S+1)} while achieving minimax-optimal widths under S-th order local smoothness.
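The core ingredient, sketched here for a bounded i.i.d. mean rather than the paper's kernel-smoother construction, is the empirical Bernstein deviation bound in its Maurer–Pontil form, applied two-sided via a union bound (δ/2 per side). The function name and the choice of two-sided split are my assumptions.

```python
import numpy as np

def empirical_bernstein_ci(x, delta=0.05):
    """Two-sided empirical Bernstein CI for the mean of i.i.d. data in [0, 1].

    Maurer-Pontil bound per side with budget delta/2 each; the sample-variance
    term lets the interval adapt to the observed spread instead of the
    worst-case range, which is what makes EB intervals 'sharp'.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    mean = x.mean()
    var = x.var(ddof=1)                  # sample variance
    log_term = np.log(4.0 / delta)       # ln(2 / (delta/2)) per side
    half_width = (np.sqrt(2.0 * var * log_term / n)
                  + 7.0 * log_term / (3.0 * (n - 1)))
    return mean - half_width, mean + half_width
```

For nearly constant data the variance term vanishes and only the O(1/n) correction remains, so the interval collapses much faster than a Hoeffding-style range-based interval would.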
-
A Semi-Supervised Kernel Two-Sample Test
A semi-supervised kernel two-sample test integrates unlabeled covariate data to achieve asymptotic normality under the null, higher power than standard kernel tests, and consistency against fixed and local alternatives.
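The statistic underlying standard kernel two-sample tests is the unbiased estimate of the squared maximum mean discrepancy (MMD²); a minimal Gaussian-kernel version is sketched below. The paper's semi-supervised variant, which folds in unlabeled covariate data, is not reproduced here, and the function name and default bandwidth are illustrative.

```python
import numpy as np

def mmd2_unbiased(x, y, bandwidth=1.0):
    """Unbiased estimate of squared MMD between 1-D samples x and y
    under the Gaussian kernel k(a, b) = exp(-(a - b)^2 / (2 * bandwidth^2)).
    """
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[:, None]

    def gram(a, b):
        # Pairwise squared distances via broadcasting, then the kernel.
        return np.exp(-((a - b.T) ** 2) / (2.0 * bandwidth ** 2))

    kxx, kyy, kxy = gram(x, x), gram(y, y), gram(x, y)
    n, m = len(x), len(y)
    # Drop diagonal terms within each sample to make the estimate unbiased.
    term_x = (kxx.sum() - np.trace(kxx)) / (n * (n - 1))
    term_y = (kyy.sum() - np.trace(kyy)) / (m * (m - 1))
    return term_x + term_y - 2.0 * kxy.mean()
```

Under the null (equal distributions) the statistic fluctuates around zero and may be slightly negative; a large positive value indicates the two samples differ.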
-
Estimating heterogeneous treatment effects with survival outcomes via a deep survival learner
DSL uses doubly robust pseudo-outcomes and a multi-output neural network to jointly estimate time-varying conditional average treatment effects for right-censored survival data.
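The generic building block is the AIPW (doubly robust) pseudo-outcome for a binary treatment, sketched below; DSL's survival-specific version (censoring weights, time-varying effects) is more involved and not reproduced. The function name is illustrative.

```python
import numpy as np

def dr_pseudo_outcome(y, t, mu1, mu0, e):
    """AIPW (doubly robust) pseudo-outcome for the CATE.

    y: observed outcome; t: binary treatment indicator;
    mu1, mu0: outcome-model predictions under treatment / control;
    e: propensity score P(T = 1 | X).

    E[phi | X] equals the CATE whenever either the outcome models or the
    propensity model is correctly specified (double robustness).
    """
    return (mu1 - mu0
            + t * (y - mu1) / e
            - (1 - t) * (y - mu0) / (1 - e))
```

Regressing these pseudo-outcomes on covariates (in DSL, via a multi-output neural network across time points) then yields an estimate of the heterogeneous treatment effect.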