Pith · machine review for the scientific record

arxiv: 1709.06173 · v1 · submitted 2017-09-18 · 💻 cs.IT · math.IT

Recognition: unknown

Robustness of Neural Networks against Storage Media Errors

Authors on Pith: no claims yet
classification: 💻 cs.IT · math.IT
keywords: neural networks · parameters · storage · ECCs · errors · media · robustness
Original abstract

We study the trade-offs between storage/bandwidth and prediction accuracy of neural networks that are stored in noisy media. Conventionally, it is assumed that all parameters (e.g., weights and biases) of a trained neural network are stored as binary arrays and are error-free. This assumption rests on the use of error correction codes (ECCs) that correct potential bit flips in storage media. However, ECCs add storage overhead and reduce bandwidth when the trained parameters are loaded during inference. We study the robustness of deep neural networks when bit errors exist but ECCs are turned off, across different neural network models and datasets. We observe that more sophisticated models and datasets are more vulnerable to errors in their trained parameters. We propose a simple detection approach that universally improves robustness, in some cases by orders of magnitude. We also propose an alternative binary representation of the parameters that reduces the distortion caused by bit flips and, in theory, makes it vanish as the number of bits used to represent a parameter grows.
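A minimal sketch of why bit flips in a stored parameter can be so damaging, assuming a plain unsigned fixed-point encoding (the helper names and the example bit pattern are illustrative, not taken from the paper): the distortion from a single flip scales with the bit's position, so a most-significant-bit flip moves the value by half the full range. This position dependence is the problem the paper's alternative binary representation is designed to mitigate.

```python
def flip_bit(code, bit):
    """Flip one bit of an integer bit pattern."""
    return code ^ (1 << bit)

def to_real(code, n_bits=8, scale=1.0):
    """Decode an unsigned n-bit fixed-point code into a weight in [0, scale)."""
    return scale * code / (1 << n_bits)

code = 0b10110100          # an 8-bit quantized parameter (arbitrary example)
w = to_real(code)
for bit in range(8):
    dw = abs(to_real(flip_bit(code, bit)) - w)
    print(f"flip bit {bit}: |delta w| = {dw:.5f}")  # equals 2**bit / 256
```

Under this encoding the expected distortion from a uniformly random single-bit flip does not shrink as more bits are added, since the high-order bits dominate; a representation in which each bit carries a bounded share of the value is what lets the distortion vanish as the bit width grows.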

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. RangeGuard: Efficient, Bounded Approximate Error Correction for Reliable DNNs

    cs.AR 2026-05 unverdicted novelty 7.0

    RangeGuard uses range identifiers to enable bounded approximate error correction, tolerating 64+ bit flips with 16-bit parity and no noticeable accuracy loss in DNN inference.

  2. Effective and Memory-Efficient Alternatives to ECC for Reliable Large-Scale DNNs

    cs.AR 2026-05 unverdicted novelty 5.0

    MSET and CEP deliver higher reliability than SECDED ECC for CNNs and Vision Transformers with zero memory overhead and substantially lower area and delay.