Pith: machine review for the scientific record

Black-box on-policy distillation of large language models. arXiv preprint arXiv:2511.10643

8 Pith papers cite this work. Polarity classification is still being indexed.

8 Pith papers citing it

fields

cs.CL (5) · cs.LG (3)

years

2026 (8)

representative citing papers

Rubric-based On-policy Distillation

cs.LG · 2026-05-08 · unverdicted · novelty 7.0

Rubric-based on-policy distillation trains student models using only teacher responses: scoring rubrics are generated from teacher–student contrasts and then used as the reward signal for on-policy optimization, achieving superior performance and up to 10x better sample efficiency than logit-based approaches.
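The loop described above can be sketched in miniature. This is a hypothetical toy, not the cited paper's implementation: the "rubric" is simply the content the teacher response has that the student's lacks, the score is the fraction of rubric items a sample satisfies, and the helper names (`make_rubric`, `rubric_score`, `on_policy_step`) are illustrative assumptions.

```python
import random

def make_rubric(teacher_response, student_response):
    """Toy rubric from a teacher-student contrast: words the
    teacher used that the student's response missed."""
    teacher_words = set(teacher_response.lower().split())
    student_words = set(student_response.lower().split())
    return sorted(teacher_words - student_words)

def rubric_score(sample, rubric):
    """Fraction of rubric criteria a sample satisfies (reward in [0, 1])."""
    if not rubric:
        return 1.0
    words = set(sample.lower().split())
    return sum(criterion in words for criterion in rubric) / len(rubric)

def on_policy_step(student_sampler, rubric, n_samples=4):
    """Draw on-policy samples from the student and score each
    against the rubric; a real trainer would reinforce by score."""
    samples = [student_sampler() for _ in range(n_samples)]
    return [(s, rubric_score(s, rubric)) for s in samples]

# Contrast one teacher response with one student response to build a rubric,
# then score fresh on-policy samples against it.
teacher = "the capital of France is Paris"
student_v0 = "France has a capital"
rubric = make_rubric(teacher, student_v0)

def student_sampler():
    # Stand-in for sampling from the student policy.
    return random.choice(["the capital is Paris", "France has a capital"])

random.seed(0)
scored = on_policy_step(student_sampler, rubric)
best_sample, best_score = max(scored, key=lambda pair: pair[1])
```

Note the black-box property this illustrates: only teacher *text* is used to build the rubric, so no teacher logits are needed, which is what distinguishes this family of methods from logit-based distillation.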

Showing 8 of 8 citing papers.