6 Pith papers cite this work.
-
The Benefits of Temporal Correlations: SGD Learns k-Juntas from Random Walks Efficiently
Temporal correlations from lazy random walks enable efficient SGD learning of k-juntas via temporal-difference loss on ReLU networks, achieving linear sample complexity in d.
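A hedged sketch of the setting this summary describes: a lazy random walk on the Boolean hypercube generates temporally correlated examples, and SGD trains a one-hidden-layer ReLU network with a temporal-difference-style squared loss on consecutive walk states. The paper's exact loss, architecture, and hyperparameters are unknown here; every concrete choice below (parity junta, step sizes, widths) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, width, steps, lr = 30, 3, 64, 5000, 0.05

junta = rng.choice(d, size=k, replace=False)      # hidden relevant coordinates
def f(x):                                         # unknown k-junta (here: a parity)
    return float(np.prod(x[junta]))

W = rng.normal(0, 1 / np.sqrt(d), (width, d))     # hidden-layer weights
a = rng.normal(0, 1 / np.sqrt(width), width)      # output weights

def net(x):
    return float(a @ np.maximum(W @ x, 0.0))

x = rng.choice([-1.0, 1.0], size=d)               # walk starts uniformly on {-1,1}^d
for _ in range(steps):
    x_next = x.copy()
    if rng.random() < 0.5:                        # lazy walk: stay with prob 1/2,
        x_next[rng.integers(d)] *= -1             # else flip one uniform coordinate
    # TD-style residual on consecutive walk states (an assumed loss form)
    resid = (net(x_next) - net(x)) - (f(x_next) - f(x))
    h, hn = np.maximum(W @ x, 0.0), np.maximum(W @ x_next, 0.0)
    grad_a = resid * (hn - h)
    grad_W = resid * (np.outer(a * (hn > 0), x_next) - np.outer(a * (h > 0), x))
    a -= lr * grad_a
    W -= lr * grad_W
    x = x_next
```

Consecutive states of the lazy walk differ in at most one coordinate, so a TD residual isolates the effect of single-coordinate flips, which is plausibly the "benefit of temporal correlations" the title refers to.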
-
Sample Complexity of Stochastic Optimization with Integer Variables
Stochastic optimization with integer variables has sample complexity that can match, undercut, or exceed the continuous case, depending on the objective's structure, with new tight bounds for nonconvex continuous problems.
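Sample complexity in this setting is typically studied through sample average approximation (SAA): draw N scenarios, minimize the empirical average cost over the integer feasible set, and ask how large N must be for the empirical minimizer to be near-optimal. The objective below is an invented newsvendor-style toy, not taken from the paper, purely to make the SAA recipe concrete.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x, z):                          # toy newsvendor-style cost (assumed)
    return 2.0 * np.maximum(x - z, 0) + 5.0 * np.maximum(z - x, 0)

N = 2000
z = rng.poisson(lam=4.0, size=N)         # demand scenarios
candidates = np.arange(0, 11)            # integer feasible set {0,...,10}
emp = [cost(x, z).mean() for x in candidates]   # empirical (sample-average) objective
x_saa = candidates[int(np.argmin(emp))]  # SAA solution
```

With a finite integer feasible set the empirical problem is solved by enumeration; the question the bounds address is how the required N scales compared with the continuous relaxation.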
-
Regret-Oracle Complexity Tradeoffs in Agnostic Online Learning
A dynamic pruning reduction from agnostic to realizable online learning via weak-consistency oracles achieves O(T^{d_VC+1}) query complexity with near-optimal regret and supplies matching upper and lower bounds on the regret-oracle tradeoff.
-
Fair Conformal Classification via Learning Representation-Based Groups
A fair conformal classification method guarantees conditional coverage on adaptively identified subgroups defined via learned representations.
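For context, a standard recipe for group-conditional coverage (not the paper's specific method, which identifies groups adaptively from learned representations) is split conformal with one calibrated threshold per group, giving each group its own 1 − α guarantee. Groups and scores below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = 0.1
n = 1000
groups = rng.integers(0, 3, size=n)             # assumed subgroup labels
scores = rng.normal(groups * 0.3, 1.0, size=n)  # nonconformity scores, calibration set

thresholds = {}
for g in range(3):
    s = np.sort(scores[groups == g])
    m = len(s)
    # conformal quantile: the ceil((m+1)(1-alpha))-th order statistic
    idx = min(int(np.ceil((m + 1) * (1 - alpha))) - 1, m - 1)
    thresholds[g] = s[idx]

# At test time, the prediction set includes every label whose
# nonconformity score is <= thresholds[group of the test point].
```

Calibrating per group trades sharper conditional guarantees for smaller calibration samples per threshold, which is why how the groups are chosen (the paper's contribution) matters.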
-
A Unified Theory of Conditional Coverage in Conformal Prediction with Applications
A unified framework derives non-asymptotic bounds on conditional miscoverage in conformal prediction via pointwise and L_p routes and gives a common view of existing methods.
-
On Uniform Error Bounds for Kernel Regression under Non-Gaussian Noise
Novel non-asymptotic uniform error bounds are derived for kernel regression under broad classes of non-Gaussian noise distributions that include correlated cases.
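A minimal instance of the kind of setting such bounds cover: kernel ridge regression on data with heavy-tailed (Laplace, hence non-Gaussian) noise, evaluated in the sup norm. Kernel, bandwidth, and regularization choices are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = np.sort(rng.uniform(-1, 1, n))
y = np.sin(np.pi * X) + rng.laplace(scale=0.2, size=n)   # non-Gaussian noise

def rbf(A, B, h=0.2):                                    # Gaussian kernel matrix
    return np.exp(-(A[:, None] - B[None, :]) ** 2 / (2 * h ** 2))

lam = 1e-2
K = rbf(X, X)
coef = np.linalg.solve(K + lam * np.eye(n), y)           # kernel ridge coefficients

Xt = np.linspace(-1, 1, 400)
pred = rbf(Xt, X) @ coef
sup_err = np.max(np.abs(pred - np.sin(np.pi * Xt)))      # uniform (sup-norm) error
```

The quantity `sup_err` is the uniform error the bounds control non-asymptotically, without assuming the noise is Gaussian or independent.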