pith. machine review for the scientific record.

EMNLP

1 Pith paper cites this work. Polarity classification is still indexing.

1 Pith paper citing it

fields: cs.LG (1)
years: 2026 (1)
verdicts: CONDITIONAL (1)

representative citing papers

Scaling Laws for Mixture Pretraining Under Data Constraints

cs.LG · 2026-05-12 · conditional · novelty 7.0

Repetition-aware scaling laws show that scarce target data in pretraining mixtures can optimally be repeated 15–20 times; the best repetition count depends on data size, compute, and model scale.

citing papers explorer

Showing 1 of 1 citing paper.

  • Scaling Laws for Mixture Pretraining Under Data Constraints cs.LG · 2026-05-12 · conditional · none · ref 5
