pith. machine review for the scientific record.

DoReFa-Net: Training Low Bitwidth Convolutional Neural Networks with Low Bitwidth Gradients

8 Pith papers cite this work. Polarity classification is still indexing.
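
The paper's title centers on low-bitwidth training; its core building block is a k-bit uniform quantizer that snaps values in [0, 1] onto 2^k evenly spaced levels. A minimal sketch, with illustrative variable names (not taken from this page):

```python
import numpy as np

# Hedged sketch of a k-bit uniform quantizer of the kind DoReFa-style
# low-bitwidth training builds on: inputs in [0, 1] are snapped to
# 2^k evenly spaced levels. Names are illustrative.
def quantize_k(x, k):
    n = 2 ** k - 1              # number of quantization steps
    return np.round(x * n) / n  # snap to {0, 1/n, ..., 1}

x = np.array([0.0, 0.26, 0.5, 0.9])
print(quantize_k(x, 2))  # k=2 gives 4 levels: {0, 1/3, 2/3, 1}
```

With k = 1 the same formula degenerates to rounding toward {0, 1}, which is why the binary and multibit papers below all cite this work.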


years

2026 (7) · 2017 (1)

representative citing papers

Mixed Precision Training

cs.AI · 2017-10-10 · accept · novelty 7.0

Mixed precision training uses FP16 for most computations, FP32 master weights for accumulation, and loss scaling to enable accurate training of large DNNs with halved memory usage.
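
The three ingredients named in the summary can be sketched in one SGD step on a toy linear model. This is a hedged illustration, not the paper's implementation; the `loss_scale` value, model, and names are assumptions:

```python
import numpy as np

# Hedged sketch of one mixed precision SGD step: FP16 copies for the
# forward/backward compute, an FP32 master copy of the weights for
# accumulation, and a loss scale that keeps small gradients above the
# FP16 underflow threshold. All names and constants are illustrative.
def mixed_precision_step(master_w, x, y, lr=0.1, loss_scale=128.0):
    w16 = master_w.astype(np.float16)        # FP16 weight copy
    x16 = x.astype(np.float16)
    err = x16 @ w16 - y.astype(np.float16)   # FP16 forward pass
    # Backward pass on the scaled loss; products of FP16 tensors are
    # accumulated in FP32, as mixed precision hardware does.
    grad = x16.astype(np.float32).T @ (loss_scale * 2.0 * err.astype(np.float32)) / len(y)
    grad /= loss_scale                       # unscale before the update
    return master_w - lr * grad              # FP32 master-weight update

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 4)).astype(np.float32)
true_w = np.array([1.0, -2.0, 0.5, 3.0], dtype=np.float32)
y = x @ true_w
w = np.zeros(4, dtype=np.float32)
for _ in range(200):
    w = mixed_precision_step(w, x, y)
print(np.round(w, 2))
```

The FP32 master copy matters because an `lr * grad` update can be smaller than the FP16 rounding step of the weight it modifies; accumulating in FP32 preserves those small updates.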

SURGE: Surrogate Gradient Adaptation in Binary Neural Networks

cs.LG · 2026-05-09 · unverdicted · novelty 6.0

SURGE proposes a dual-path gradient compensator and adaptive scaler to learn better surrogate gradients for binary neural network training, outperforming prior methods on classification, detection, and language tasks.
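
The baseline that surrogate-gradient methods like this refine is the straight-through estimator (STE): the forward pass uses the hard sign, while the backward pass substitutes an identity gradient clipped to [-1, 1]. A minimal sketch (SURGE's dual-path compensator and adaptive scaler are not reproduced here; names are illustrative):

```python
import numpy as np

# Hedged sketch of the straight-through estimator for binarized weights.
# Forward uses sign(w); backward pretends d sign(w)/dw = 1 inside
# [-1, 1] and 0 outside, so gradients can flow through the hard sign.
def binarize_forward(w):
    return np.where(w >= 0, 1.0, -1.0)

def ste_backward(w, grad_out):
    # Pass the upstream gradient through where |w| <= 1, zero it elsewhere.
    return grad_out * (np.abs(w) <= 1.0)

w = np.array([-1.5, -0.3, 0.2, 2.0])
g = np.full(4, 0.4)
print(binarize_forward(w))  # [-1. -1.  1.  1.]
print(ste_backward(w, g))   # gradient blocked where |w| > 1
```

Learning a better surrogate, as the summary describes, amounts to replacing the fixed box window in `ste_backward` with a trainable shape.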

Multibit neural inference in an N-ary crossbar architecture

cs.AR · 2026-04-28 · unverdicted · novelty 5.0

Simulation of 4-state MTJ crossbars achieves 94.48% MNIST accuracy for neural inference, close to the 97.56% software baseline; analysis identifies quantization as the primary error source and finds an optimal number of states per cell.
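
The quantization error the paper analyzes comes from snapping each weight onto one of N discrete cell states. A hedged sketch of uniform quantization onto N levels, with `n_states=4` mirroring the 4-state MTJ cells (the clipping range [-1, 1] is an assumption for illustration):

```python
import numpy as np

# Hedged sketch: uniform quantization of weights onto N crossbar
# states, the error source analyzed in the paper. The range and
# variable names are illustrative, not from the paper.
def quantize_to_states(w, n_states=4, lo=-1.0, hi=1.0):
    w = np.clip(w, lo, hi)
    step = (hi - lo) / (n_states - 1)             # spacing between levels
    return lo + np.round((w - lo) / step) * step  # snap to nearest level

w = np.array([-0.9, -0.2, 0.1, 0.75])
print(quantize_to_states(w))  # 4 levels: -1, -1/3, 1/3, 1
```

The "optimal number of states per cell" trade-off follows directly: more states shrink the quantization step but narrow the noise margin between adjacent device conductances.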

Design and Implementation of BNN-Based Object Detection on FPGA

cs.AR · 2026-05-05 · unverdicted · novelty 4.0 · 2 refs

A BNN-based YOLOv3-tiny-like object detector with 1-bit weights and 8-bit activations is implemented in Verilog on FPGA, achieving 39.6% mAP50 on VOC and 0.999964 correlation with the ONNX model in RTL simulation.
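
The W1A8 scheme the summary describes pairs 1-bit weights with 8-bit activations. A hedged sketch of one such quantization step, assuming sign weights with a per-layer scale and activations clipped to [0, 1]; none of these scaling choices are taken from the FPGA implementation:

```python
import numpy as np

# Hedged sketch of a W1A8 quantization step: 1-bit weights (sign plus a
# per-layer scale) and 8-bit unsigned activations. The scale choice and
# the [0, 1] activation range are illustrative assumptions.
def quantize_w1a8(w, a):
    alpha = np.mean(np.abs(w))                # per-layer weight scale
    w_bin = np.where(w >= 0, 1, -1)           # 1-bit weights
    a = np.clip(a, 0.0, 1.0)                  # assumed activation range
    a_q = np.round(a * 255).astype(np.uint8)  # 8-bit activations
    return alpha, w_bin, a_q

alpha, w_bin, a_q = quantize_w1a8(np.array([0.3, -0.1, 0.2]),
                                  np.array([0.0, 0.5, 1.0]))
print(alpha, w_bin, a_q)
```

In hardware this split pays off because each multiply collapses to an add/subtract of an 8-bit value, with the single scale `alpha` applied once per output.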

citing papers explorer

Showing 8 of 8 citing papers.