pith. machine review for the scientific record.

MemoryFormer: Minimize transformer computation by removing fully-connected layers

1 Pith paper cites this work. Polarity classification is still being indexed.


representative citing papers

Graph Memory Transformer (GMT)

cs.LG · 2026-04-26 · unverdicted · novelty 5.0

Graph Memory Transformer (GMT) swaps dense FFN sublayers for a graph of 128 centroids and a learned 128x128 transition matrix per block, yielding an 82M-parameter decoder-only LM that trains stably but trails a 103M-parameter dense baseline in perplexity.
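The summary above describes replacing each dense FFN sublayer with 128 centroids connected by a learned 128x128 transition matrix. A minimal sketch of what such a sublayer could look like, assuming soft centroid assignment by dot-product similarity and a residual read-out (the class name, routing scheme, and initialization are illustrative assumptions, not the paper's actual implementation):

```python
import torch
import torch.nn as nn

class GraphMemoryBlock(nn.Module):
    """Hypothetical sketch of a graph-memory sublayer: a dense FFN is
    replaced by n_centroids memory vectors plus a learned transition
    matrix over the centroid graph."""

    def __init__(self, d_model: int, n_centroids: int = 128):
        super().__init__()
        self.centroids = nn.Parameter(torch.randn(n_centroids, d_model))
        # Learned 128x128 transition matrix (identity init is an assumption).
        self.transition = nn.Parameter(torch.eye(n_centroids))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Soft-assign each token to centroids by dot-product similarity.
        logits = x @ self.centroids.t()        # (..., n_centroids)
        weights = torch.softmax(logits, dim=-1)
        # Propagate the assignment through the learned graph transition.
        routed = weights @ self.transition     # (..., n_centroids)
        # Read memory back out as a mixture of centroids, with a residual.
        return x + routed @ self.centroids
```

With 128 centroids of width d_model plus a fixed 128x128 transition per block, the parameter count is much smaller than a dense FFN's two d_model-by-4*d_model projections, which is consistent with the 82M vs. 103M comparison reported above.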

citing papers explorer

Showing 1 of 1 citing paper.

  • Graph Memory Transformer (GMT) cs.LG · 2026-04-26 · unverdicted · none · ref 22