Pith: machine review for the scientific record

7th International Conference on Learning Representations (ICLR 2019)

1 Pith paper cites this work. Polarity classification is still indexing.

citation-role summary: background (1)

citation-polarity summary: still indexing

fields: cs.LG (1)

years: 2020 (1)

verdicts: unverdicted (1)

roles: background (1)

polarities: background (1)

representative citing papers

Rethinking Attention with Performers

cs.LG · 2020-09-30 · unverdicted · novelty 7.0

Performers approximate full-rank softmax attention in Transformers via FAVOR+ random features for linear complexity, with theoretical guarantees of unbiased estimation and competitive results on pixel, text, and protein tasks.
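The FAVOR+ idea summarized above replaces the n×n softmax attention matrix with a product of random-feature maps, so cost becomes linear in sequence length. A minimal NumPy sketch of the bidirectional case follows; it omits the paper's orthogonal-feature and feature-redrawing refinements, and all function names here are illustrative, not the paper's API:

```python
import numpy as np

def positive_random_features(x, w):
    """FAVOR+-style positive features: phi(x) = exp(w.x - |x|^2/2) / sqrt(m),
    an unbiased feature map for the kernel exp(x.y)."""
    m = w.shape[0]
    norm_sq = np.sum(x * x, axis=-1, keepdims=True)
    return np.exp(x @ w.T - norm_sq / 2.0) / np.sqrt(m)

def performer_attention(q, k, v, n_features=256, seed=0):
    """Approximate softmax attention in O(n) time/memory in sequence length:
    compute phi(K)^T V once, then one matmul per query."""
    d = q.shape[-1]
    w = np.random.default_rng(seed).standard_normal((n_features, d))
    scale = d ** -0.25                     # split softmax's 1/sqrt(d) between q and k
    q_prime = positive_random_features(q * scale, w)   # (n, m)
    k_prime = positive_random_features(k * scale, w)   # (n, m)
    kv = k_prime.T @ v                     # (m, d_v) -- never materializes (n, n)
    normalizer = q_prime @ k_prime.sum(axis=0)         # (n,)
    return (q_prime @ kv) / normalizer[:, None]

def softmax_attention(q, k, v):
    """Exact quadratic softmax attention, for comparison."""
    logits = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Because the feature map is an unbiased estimator of the softmax kernel, the approximation tightens as `n_features` grows; the paper's orthogonal random features reduce the estimator's variance further.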

citing papers explorer

Showing 1 of 1 citing paper.

  • Rethinking Attention with Performers cs.LG · 2020-09-30 · unverdicted · none · ref 37
