Scaling Language Models: Methods, Analysis & Insights from Training Gopher
Gopher, a 280 billion parameter language model, achieves state-of-the-art performance on the majority of 152 tasks with largest gains in reading comprehension, fact-checking, and toxic language detection.