2 Pith papers cite this work. 2 representative citing papers (2016):
- A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks
  Maximum softmax probability acts as a baseline for detecting misclassified and out-of-distribution examples in neural networks (see the MSP sketch after this list).
- Gaussian Error Linear Units (GELUs)
  The GELU activation, xΦ(x), outperforms ReLU and ELU on computer vision, NLP, and speech tasks by weighting inputs by their value rather than gating them by sign (see the GELU sketch after this list).
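
As a rough illustration of the maximum softmax probability (MSP) baseline summarized in the first item, here is a minimal NumPy sketch. The function name msp_scores, the example logits, and the 0.5 threshold are illustrative assumptions, not details from the paper.

```python
import numpy as np

def msp_scores(logits: np.ndarray) -> np.ndarray:
    """Maximum softmax probability (MSP) per example.

    Lower scores suggest an example is more likely to be
    misclassified or out-of-distribution.
    """
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return probs.max(axis=1)

# Illustrative logits: one confident prediction, one ambiguous one.
logits = np.array([[4.0, 0.5, 0.1],
                   [1.1, 1.0, 0.9]])
scores = msp_scores(logits)

# Flag low-confidence examples; the 0.5 threshold is an assumption and
# would be tuned on held-out in-distribution data in practice.
flagged = scores < 0.5
print(scores)   # ~[0.95, 0.37]
print(flagged)  # [False, True]
```

In practice the scores would come from a trained classifier's logits, with the threshold chosen to trade off false alarms against missed errors.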
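
And a minimal sketch of the exact GELU from the second item, x times the standard normal CDF Φ(x), contrasted with ReLU's hard sign gate. The helper names and sample inputs are illustrative, not taken from the paper.

```python
import math

def gelu(x: float) -> float:
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def relu(x: float) -> float:
    """ReLU gates on the sign of the input: negative values are zeroed."""
    return max(0.0, x)

# GELU scales inputs smoothly by their magnitude, so slightly negative
# values still pass a small (negative) signal, unlike ReLU's hard cutoff.
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x={x:+.1f}  gelu={gelu(x):+.4f}  relu={relu(x):+.4f}")
```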