pith. machine review for the scientific record.

Striped attention: Faster ring attention for causal transformers

6 Pith papers cite this work. Polarity classification is still indexing.

6 Pith papers citing it

citation-role summary

background 1

citation-polarity summary

fields

cs.LG 4 cs.DC 2

verdicts

UNVERDICTED 6


representative citing papers

Kaczmarz Linear Attention

cs.LG · 2026-05-09 · unverdicted · novelty 5.0

Kaczmarz Linear Attention replaces the empirical coefficient in Gated DeltaNet with a key-norm-normalized step size derived from the online regression objective, yielding lower perplexity and better needle-in-haystack performance.
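The cited idea can be illustrated with a minimal NumPy sketch: a delta-rule state update where the usual empirical coefficient beta is replaced by the Kaczmarz step size 1/||k||², which is the exact minimizer step for the online regression objective ||S k − v||². This is an assumption-laden illustration, not the paper's implementation; the function name and the stabilizing epsilon are invented here.

```python
import numpy as np

def kaczmarz_delta_update(S, k, v):
    """One delta-rule state update with a Kaczmarz step size.

    Instead of an empirical/learned beta (as in Gated DeltaNet),
    the step size is 1 / ||k||^2, the exact Kaczmarz step for the
    online regression objective ||S k - v||^2.
    """
    beta = 1.0 / (k @ k + 1e-8)         # key-norm-normalized step size (epsilon is an assumption)
    err = v - S @ k                      # prediction error for this key
    return S + beta * np.outer(err, k)  # rank-1 Kaczmarz correction

# After one update, the state reproduces v for key k (up to epsilon):
d_k, d_v = 4, 3
rng = np.random.default_rng(0)
S = rng.standard_normal((d_v, d_k))
k = rng.standard_normal(d_k)
v = rng.standard_normal(d_v)
S = kaczmarz_delta_update(S, k, v)
```

The key-norm normalization means no per-token coefficient has to be learned: the step size adapts automatically to the scale of each key.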
