pith · machine review for the scientific record

Bridging Kolmogorov Complexity and Deep Learning: Asymptotically Optimal Description Length Objectives for Transformers

1 Pith paper cites this work. Polarity classification is still indexing.

1 Pith paper citing it

fields: cs.LG (1) · years: 2026 (1) · verdicts: unverdicted (1)

representative citing papers

Neural Weight Norm = Kolmogorov Complexity

cs.LG · 2026-05-11 · unverdicted · novelty 7.0

The minimal weight norm of a fixed-precision looped neural network equals the Kolmogorov complexity of the output string up to a log factor, making weight decay match the optimal universal prior up to a polynomial factor.
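The cited claim ties weight norm to code length. As a point of reference (this sketch is illustrative, not from either paper; `description_length` and its constant `lam` are hypothetical names), the textbook correspondence it sharpens is that weight decay acts as a two-part description-length penalty: minimizing NLL plus a squared-norm term is MAP inference under a Gaussian prior, i.e. spending roughly `lam * ||w||^2` nats describing the weights.

```python
# Illustrative sketch (not the cited paper's construction): weight decay
# as a two-part code length. Minimizing NLL(data | w) + lam * ||w||^2 is
# MAP inference under a Gaussian prior p(w) ∝ exp(-lam * ||w||^2), so the
# penalty term is (up to a constant) the nats spent encoding the weights.
import numpy as np

def description_length(nll, w, lam=1e-2):
    """Two-part code length in nats: data given weights + weights."""
    return nll + lam * np.sum(w ** 2)

rng = np.random.default_rng(0)
w_small = rng.normal(scale=0.1, size=100)  # low-norm weights
w_large = rng.normal(scale=1.0, size=100)  # high-norm weights

# At equal data fit, the smaller-norm weights get the shorter code,
# matching the intuition that low weight norm ≈ low complexity.
assert description_length(5.0, w_small) < description_length(5.0, w_large)
```

This is only the classical MDL/MAP reading of weight decay; the cited paper's contribution is the stronger claim that the penalty tracks Kolmogorov complexity itself, up to the stated factors.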

citing papers explorer

Showing 1 of 1 citing paper.

  • Neural Weight Norm = Kolmogorov Complexity cs.LG · 2026-05-11 · unverdicted · none · ref 35
