1 Pith paper cites this work; polarity classification is still indexing.
Fields: cs.CV · Year: 2021 · Verdict: CONDITIONAL · 1 representative citing paper
Zero-Shot Text-to-Image Generation
A transformer autoregressively models text and image tokens as one stream and produces competitive zero-shot text-to-image results at sufficient scale.
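The single-stream idea can be illustrated with a minimal sketch: text and image tokens are mapped into one shared vocabulary (image ids offset past the text ids so a single softmax covers both) and concatenated, and training is plain next-token prediction over that stream. The function names, vocabulary sizes, and helper structure below are illustrative assumptions, not the paper's actual code; only the 8192-entry image codebook size matches the paper's dVAE.

```python
TEXT_VOCAB = 16384   # assumed toy text vocabulary; ids occupy [0, TEXT_VOCAB)
IMAGE_VOCAB = 8192   # dVAE codebook size from the paper; ids are offset past TEXT_VOCAB

def build_stream(text_ids, image_ids):
    """Concatenate text and image tokens into one autoregressive stream.
    Image ids are shifted by TEXT_VOCAB so the two vocabularies don't collide
    and one output softmax can score both modalities."""
    assert all(0 <= t < TEXT_VOCAB for t in text_ids)
    assert all(0 <= i < IMAGE_VOCAB for i in image_ids)
    return text_ids + [TEXT_VOCAB + i for i in image_ids]

def next_token_pairs(stream):
    """Training pairs for next-token prediction: the model sees the prefix
    ending at position t and must predict stream[t + 1]."""
    return list(zip(stream[:-1], stream[1:]))

stream = build_stream([5, 9], [0, 4091])
print(stream)                    # [5, 9, 16384, 20475]
print(next_token_pairs(stream))  # [(5, 9), (9, 16384), (16384, 20475)]
```

At generation time the same stream layout supports zero-shot text-to-image use: the text tokens are fixed as the prompt prefix and the model samples the remaining image tokens one at a time.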