pith. machine review for the scientific record.

arXiv preprint arXiv:2402.01032

7 Pith papers cite this work. Polarity classification is still indexing.

7 Pith papers citing it

fields: cs.LG 6 · cs.CL 1

years: 2026 6 · 2025 1

verdicts: UNVERDICTED 7

representative citing papers

The Recurrent Transformer: Greater Effective Depth and Efficient Decoding

cs.LG · 2026-04-23 · unverdicted · novelty 6.0

Recurrent Transformers add per-layer recurrent memory via self-attention over their own activations, together with a tiling algorithm that reduces training memory traffic, yielding better C4 pretraining cross-entropy than parameter-matched standard transformers with fewer layers.
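The summary only names the mechanism, so here is a minimal PyTorch sketch of one way "per-layer recurrent memory via self-attention over its own activations" could look: a single layer unrolled for several recurrent steps, each step attending over the activations the layer produced on earlier steps. Every name, shape, and design choice (RecurrentMemoryLayer, mem_steps, the stop-gradient on stored activations) is an assumption for illustration, not the cited paper's actual architecture, and the tiling algorithm for memory traffic is not modeled here.

```python
# Hypothetical sketch, not the paper's implementation: a transformer layer
# unrolled over recurrent steps, attending to its own earlier activations.
import torch
import torch.nn as nn


class RecurrentMemoryLayer(nn.Module):
    def __init__(self, d_model: int = 256, n_heads: int = 4, mem_steps: int = 2):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.mem_steps = mem_steps  # assumed number of recurrent passes

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model). Reuse the same parameters mem_steps times;
        # each pass attends over a memory of the layer's earlier activations.
        memory = []
        h = x
        for _ in range(self.mem_steps):
            # Keys/values are the current activations plus all stored ones.
            kv = torch.cat(memory + [h], dim=1) if memory else h
            a, _ = self.attn(self.norm1(h), self.norm1(kv), self.norm1(kv))
            h = h + a
            h = h + self.ff(self.norm2(h))
            # Assumption: stop gradients into past steps to bound backprop cost.
            memory.append(h.detach())
        return h


if __name__ == "__main__":
    layer = RecurrentMemoryLayer()
    y = layer(torch.randn(2, 16, 256))
    print(y.shape)  # torch.Size([2, 16, 256])
```

One motivation for a design like this is effective depth: unrolling one layer over several memory-augmented passes behaves like a deeper network without adding parameters, which is consistent with the summary's claim of matching deeper standard transformers with fewer layers.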
