Rahimi, A., & Recht, B. (2007). Random features for large-scale kernel machines. Advances in Neural Information Processing Systems, 20.
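The cited work introduces random Fourier features: a random projection followed by a cosine nonlinearity whose inner products approximate a shift-invariant kernel. A minimal NumPy sketch (sample count, dimensions, and the bandwidth `gamma` are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D, gamma = 100, 5, 4000, 0.5   # samples, input dim, feature dim, RBF width

# Sample the random projection ONCE; the same (W, b) must be shared by all inputs.
W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(X):
    """Map X so that rff(x) @ rff(y) ~= exp(-gamma * ||x - y||^2)."""
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = rng.normal(size=(n, d))
Z = rff(X)
K_approx = Z @ Z.T                                        # feature-space inner products
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)       # pairwise squared distances
K_exact = np.exp(-gamma * sq)                             # exact Gaussian kernel
err = np.abs(K_approx - K_exact).max()
```

The approximation error shrinks as O(1/sqrt(D)), so the explicit feature map can replace the n-by-n kernel matrix in large-scale learners.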
Three papers cite this work (polarity classification is still indexing):
- Backdoor Channels Hidden in Latent Space: Cryptographic Undetectability in Modern Neural Networks
  Backdoors can be realized as statistically natural latent directions in modern neural networks, achieving high attack success with negligible clean accuracy loss and resisting existing defenses.
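A hypothetical toy illustration of the "backdoor as a latent direction" idea, not the paper's construction: a linear head that behaves normally on typical inputs but flips its output whenever features project strongly onto a secret direction. All names, the threshold, and the trigger shift are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 64

# Invented for this sketch: a secret unit direction d in feature space and an
# ordinary linear head w_clean; nothing here comes from the paper itself.
d = rng.normal(size=dim)
d /= np.linalg.norm(d)
w_clean = rng.normal(size=dim)

def predict(feats):
    # Backdoor rule: if the projection onto d exceeds a threshold, force class 1;
    # otherwise behave like the clean linear classifier.
    triggered = feats @ d > 3.0
    clean_pred = (feats @ w_clean > 0).astype(int)
    return np.where(triggered, 1, clean_pred)

clean_feats = rng.normal(size=(1000, dim))   # typical latent codes: d-projection ~ N(0, 1)
poisoned_feats = clean_feats + 5.0 * d       # trigger shifts the projection to ~ N(5, 1)

attack_success = np.mean(predict(poisoned_feats) == 1)
false_trigger = np.mean(clean_feats @ d > 3.0)
```

Because the d-projection of typical latent codes is standard normal, the trigger almost never fires on clean inputs, which is the sense in which such a direction can look statistically natural.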
- Spectral Path Regression: Directional Chebyshev Harmonics for Interpretable Tabular Learning
  Directional Chebyshev harmonics enable spectral path regression for tabular data with closed-form training, competitive accuracy, and explicit interpretability.
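The paper's exact spectral-path construction is not reproduced here; as a generic sketch of closed-form training over directional Chebyshev features, one can project inputs onto a few directions, expand each projection in the Chebyshev basis, and solve a least-squares problem (the random directions, degree, and tanh squashing are assumptions of this sketch, not the paper's recipe):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(1)

def cheb_features(X, directions, degree):
    # Project inputs onto unit directions, squash into [-1, 1] (tanh is an
    # assumption of this sketch), then expand each projection in the Chebyshev basis.
    P = np.tanh(X @ directions.T)                 # (n_samples, n_directions)
    blocks = [C.chebvander(P[:, j], degree) for j in range(P.shape[1])]
    return np.hstack(blocks)                      # (n_samples, n_directions * (degree + 1))

# Toy regression problem
X = rng.normal(size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

directions = rng.normal(size=(4, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)

Phi = cheb_features(X, directions, degree=5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)       # closed-form least-squares fit
pred = Phi @ w
```

Interpretability in this style of model comes from inspecting which direction/degree coefficients in `w` carry weight; training is a single linear solve rather than iterative optimization.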
- Sparse Random-Feature Neural Networks with Krylov-Based SVD for Singularly Perturbed ODE
  Sparse RFNNs with sSVD via Lanczos-Golub-Kahan bidiagonalization maintain accuracy while improving efficiency and robustness for 1D steady convection-diffusion equations with strong advection.
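The paper's sSVD pipeline is not reproduced here; as a sketch of the underlying Krylov tool, the following shows Golub-Kahan (Lanczos) bidiagonalization with full reorthogonalization, whose small bidiagonal matrix yields estimates of a matrix's leading singular values (the function name, step count, and test spectrum are choices of this sketch):

```python
import numpy as np

def gk_bidiag_svd(A, k, rng=np.random.default_rng(0)):
    """Estimate leading singular values of A via k steps of Golub-Kahan
    (Lanczos) bidiagonalization with full reorthogonalization."""
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    u = rng.normal(size=m)
    U[:, 0] = u / np.linalg.norm(u)
    for j in range(k):
        v = A.T @ U[:, j]
        if j > 0:
            v -= beta[j - 1] * V[:, j - 1]
        v -= V[:, :j] @ (V[:, :j].T @ v)          # full reorthogonalization
        alpha[j] = np.linalg.norm(v)
        V[:, j] = v / alpha[j]
        u = A @ V[:, j] - alpha[j] * U[:, j]
        u -= U[:, : j + 1] @ (U[:, : j + 1].T @ u)
        beta[j] = np.linalg.norm(u)
        U[:, j + 1] = u / beta[j]
    # A V_k = U_{k+1} B_k with B_k lower bidiagonal; its singular values
    # approximate the leading singular values of A.
    B = np.zeros((k + 1, k))
    B[np.arange(k), np.arange(k)] = alpha
    B[np.arange(1, k + 1), np.arange(k)] = beta
    return np.linalg.svd(B, compute_uv=False)

# Test matrix with a known, geometrically decaying spectrum
rng = np.random.default_rng(1)
Q1, _ = np.linalg.qr(rng.normal(size=(40, 30)))
Q2, _ = np.linalg.qr(rng.normal(size=(30, 30)))
s_true = 2.0 ** -np.arange(30)
A = (Q1 * s_true) @ Q2.T
s_est = gk_bidiag_svd(A, k=12)
```

Only matrix-vector products with A and A.T are needed, which is why this family of methods suits large sparse feature matrices.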