pith. machine review for the scientific record.

Data mixing laws: Optimizing data mixtures by predicting language modeling performance

6 Pith papers cite this work. Polarity classification is still indexing.

6 Pith papers citing it

years

2026: 5 · 2024: 1

representative citing papers

Scaling Laws for Mixture Pretraining Under Data Constraints

cs.LG · 2026-05-12 · conditional · novelty 7.0

Repetition-aware scaling laws show scarce target data in pretraining mixtures can be repeated 15-20 times optimally, with the best count depending on data size, compute, and model scale.

On the Invariance and Generality of Neural Scaling Laws

cs.LG · 2026-05-08 · unverdicted · novelty 7.0

Neural scaling laws are invariant under bijective data transformations and change predictably with information resolution ρ under non-bijective transformations, enabling cross-domain transport of fitted exponents.

Knowledge Transfer Scaling Laws for 3D Medical Imaging

cs.CV · 2026-05-07 · conditional · novelty 6.0

Transfer-aware data allocation derived from observed power-law scaling laws for asymmetric knowledge transfer in 3D medical imaging outperforms standard proportional sampling by up to 58% and generalizes to new budgets.

Evaluation-driven Scaling for Scientific Discovery

cs.LG · 2026-04-21 · unverdicted · novelty 6.0

SimpleTES scales test-time evaluation in LLMs to discover state-of-the-art solutions on 21 scientific problems across six domains, outperforming frontier models and optimization pipelines, with examples including a 2x faster LASSO solver and new Erdős constructions.

citing papers explorer

Showing 6 of 6 citing papers.