Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Batch Normalization normalizes layer inputs per mini-batch to reduce internal covariate shift, allowing higher learning rates, less careful initialization, and faster convergence in deep networks.
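
To make the summary concrete, here is a minimal NumPy sketch of the batch-normalization forward pass: each feature is normalized to zero mean and unit variance over the mini-batch, then rescaled and shifted by the learnable parameters gamma and beta from the paper. The function name, epsilon value, and toy shapes are illustrative assumptions, not a reference implementation.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalize a mini-batch x of shape (N, D).

    Computes x_hat = (x - mu_B) / sqrt(var_B + eps), then
    y = gamma * x_hat + beta, where gamma and beta are
    learnable per-feature scale and shift parameters.
    (Illustrative sketch; not the paper's reference code.)
    """
    mu = x.mean(axis=0)                     # per-feature mini-batch mean
    var = x.var(axis=0)                     # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * x_hat + beta             # scale/shift preserves capacity

# Usage: a mini-batch of 32 examples with 4 features,
# deliberately shifted and scaled away from (0, 1).
x = np.random.randn(32, 4) * 5.0 + 3.0
y = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # approximately 0 per feature
print(y.std(axis=0))   # approximately 1 per feature
```

With gamma = 1 and beta = 0 the output is simply the normalized activations; during training these parameters are learned along with the rest of the network, which is how the layer keeps its representational power.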