pith. machine review for the scientific record.

Sequence-level knowledge distillation

3 Pith papers cite this work. Citation polarity classification is still indexing.

3 Pith papers citing it

fields

cs.LG (2) · cs.CL (1)

years

2026 (2) · 2023 (1)

representative citing papers

TIP: Token Importance in On-Policy Distillation

cs.LG · 2026-04-15 · conditional · novelty 6.0

In on-policy distillation, tokens with high student entropy or low entropy plus high teacher divergence provide dense corrective signal, allowing effective training on under 20% of tokens across math and planning tasks.
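The selection rule summarized above can be sketched as a per-token mask: keep tokens where the student is uncertain (high entropy), or where it is confidently wrong (low entropy but high divergence from the teacher). This is an illustrative reconstruction under assumed thresholds, not TIP's actual implementation; the function names and cutoff values are made up for the example.

```python
import numpy as np

def entropy(p, axis=-1):
    # Shannon entropy (in nats) of per-token probability distributions.
    return -np.sum(p * np.log(p + 1e-12), axis=axis)

def kl_div(p, q, axis=-1):
    # KL(p || q): divergence of the student q from the teacher p, per token.
    return np.sum(p * np.log((p + 1e-12) / (q + 1e-12)), axis=axis)

def select_tokens(student_probs, teacher_probs,
                  high_entropy=1.0, low_entropy=0.3, min_divergence=0.5):
    # Mask is True where the token carries dense corrective signal:
    # either the student is uncertain, or it is confidently far from the teacher.
    h = entropy(student_probs)
    d = kl_div(teacher_probs, student_probs)
    return (h > high_entropy) | ((h < low_entropy) & (d > min_divergence))

# Toy vocabulary of 3; one row per generated token.
student = np.array([
    [1/3, 1/3, 1/3],    # uncertain: high entropy -> selected
    [0.98, 0.01, 0.01],  # confident and matches teacher -> skipped
    [0.98, 0.01, 0.01],  # confident but teacher disagrees -> selected
])
teacher = np.array([
    [1/3, 1/3, 1/3],
    [0.98, 0.01, 0.01],
    [0.01, 0.98, 0.01],
])
mask = select_tokens(student, teacher)
print(mask)  # [ True False  True]
```

With thresholds like these, only a fraction of tokens survive the mask, which is consistent with the paper's claim of training on under 20% of tokens.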

citing papers explorer

Showing 3 of 3 citing papers.