pith. machine review for the scientific record.

Title resolution pending

2 Pith papers cite this work. Polarity classification is still indexing.

2 Pith papers citing it

fields

cs.CL · 1 · cs.LG · 1

years

2020 · 1 · 2017 · 1

verdicts

ACCEPT · 2

representative citing papers

Reformer: The Efficient Transformer

cs.LG · 2020-01-13 · accept · novelty 8.0

Reformer matches standard Transformer accuracy on long sequences while using far less memory and running faster via LSH attention and reversible residual layers.

citing papers explorer

Showing 2 of 2 citing papers.

  • Reformer: The Efficient Transformer cs.LG · 2020-01-13 · accept · none · ref 5


  • TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension cs.CL · 2017-05-09 · accept · none · ref 2

    TriviaQA is a large-scale reading-comprehension dataset featuring complex compositional questions, high lexical variability, and cross-sentence reasoning requirements; current baselines reach only about 40% accuracy while humans reach about 80%.