Temporal correlations from lazy random walks enable efficient SGD learning of k-juntas via temporal-difference loss on ReLU networks, achieving linear sample complexity in d.
arXiv preprint arXiv:2010.08515
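The summary above can be sketched as a small classical experiment. This is an illustrative sketch only, not the paper's exact algorithm: it pairs a lazy random walk on the hypercube {-1,+1}^d with a temporal-difference-style SGD update on a one-hidden-layer ReLU network, using parity on k hidden coordinates as the k-junta; all hyperparameters (`d`, `k`, `m`, `lr`, `T`) are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, m, lr, T = 16, 3, 64, 0.02, 4000   # illustrative hyperparameters

S = rng.choice(d, size=k, replace=False)  # hidden relevant coordinates
def f(x):                                 # target k-junta: parity on S
    return float(np.prod(x[S]))

def lazy_walk(steps):
    """Lazy random walk on {-1,+1}^d: each step, with prob. 1/2 flip one
    uniformly random coordinate, else stay put. Consecutive samples are
    therefore temporally correlated (they differ in at most one bit)."""
    x = rng.choice([-1.0, 1.0], size=d)
    out = [x.copy()]
    for _ in range(steps):
        if rng.random() < 0.5:
            i = rng.integers(d)
            x[i] = -x[i]
        out.append(x.copy())
    return out

# One-hidden-layer ReLU network parameters.
W = rng.normal(0.0, 1.0 / np.sqrt(d), size=(m, d))
b = np.zeros(m)
a = rng.choice([-1.0, 1.0], size=m) / np.sqrt(m)

xs = lazy_walk(T)
for t in range(T):
    x0, x1 = xs[t], xs[t + 1]
    h0 = np.maximum(W @ x0 + b, 0.0)
    h1 = np.maximum(W @ x1 + b, 0.0)
    # Temporal-difference-style loss on consecutive walk samples:
    # regress the change in network output onto the change in label.
    err = (a @ h1 - a @ h0) - (f(x1) - f(x0))
    ga = err * (h1 - h0)                      # grad wrt output weights
    gW = err * (np.outer(a * (h1 > 0), x1)    # subgradient through ReLU
                - np.outer(a * (h0 > 0), x0))
    gb = err * (a * (h1 > 0) - a * (h0 > 0))
    a -= lr * ga
    W -= lr * gW
    b -= lr * gb
```

The point of the setup is that the walk only flips one coordinate per step, so the difference loss isolates single-coordinate effects; how this yields the paper's linear-in-d sample complexity is a claim of the paper, not of this sketch.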
2 Pith papers cite this work. Polarity classification is still indexing.
Citing papers
- The Benefits of Temporal Correlations: SGD Learns k-Juntas from Random Walks Efficiently
Temporal correlations from lazy random walks enable efficient SGD learning of k-juntas via temporal-difference loss on ReLU networks, achieving linear sample complexity in d.
- Pixel-Translation-Equivariant Quantum Convolutional Neural Networks via Fourier Multiplexers
QCNN layers equivariant under pixel cyclic shifts are exactly characterized as Fourier-mode multiplexers after QFT, enabling a deep network with constant expected gradient norm at initialization.
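The Fourier-multiplexer characterization has a simple classical-numerics analogue, shown below; this is not the paper's quantum construction. A cyclic-shift-equivariant linear layer is a circular convolution, and after the discrete Fourier transform (the classical counterpart of the QFT) it acts as a per-mode multiplier with no mixing between Fourier modes; names like `conv` and `shift` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
h = rng.normal(size=n)   # filter defining the equivariant (circulant) layer
x = rng.normal(size=n)   # input "image" on a cycle of n pixels

def conv(v):
    """Circular convolution with h, computed via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(v)))

def shift(v):
    """Cyclic pixel shift by one position."""
    return np.roll(v, 1)

# Equivariance under pixel cyclic shifts: conv(shift(x)) == shift(conv(x)).
print(np.allclose(conv(shift(x)), shift(conv(x))))   # True

# After the Fourier transform, the layer "multiplexes" Fourier modes: each
# mode is scaled by the matching entry of fft(h), with no mode mixing.
print(np.allclose(np.fft.fft(conv(x)), np.fft.fft(h) * np.fft.fft(x)))  # True
```

The constant-gradient-norm claim at initialization is specific to the paper's quantum architecture and is not reproduced here.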