Pith · machine review for the scientific record

Do not blindly imitate the teacher: Using perturbed loss for knowledge distillation. arXiv preprint arXiv:2305.05010

1 Pith paper cites this work. Polarity classification is still indexing.

1 Pith paper citing it

fields: cs.CL (1)

years: 2023 (1)

verdicts: CONDITIONAL (1)

representative citing papers

MiniLLM: On-Policy Distillation of Large Language Models

cs.CL · 2023-06-14 · conditional · novelty 6.0

MiniLLM distills large language models into smaller ones via reverse KL divergence and on-policy optimization, yielding higher-quality responses with lower exposure bias than standard KD baselines.

citing papers explorer

Showing 1 of 1 citing paper.

  • MiniLLM: On-Policy Distillation of Large Language Models cs.CL · 2023-06-14 · conditional · none · ref 25

    MiniLLM distills large language models into smaller ones via reverse KL divergence and on-policy optimization, yielding higher-quality responses with lower exposure bias than standard KD baselines.
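The MiniLLM summary above names reverse KL divergence as the distillation objective, in contrast to the forward KL used in standard word-level KD. The sketch below is only an illustration of that distinction on made-up toy distributions, not MiniLLM's actual training code (which optimizes the reverse KL on-policy, under samples drawn from the student).

```python
# Hedged sketch: forward vs. reverse KL between a teacher and a student
# next-token distribution. Toy logits, not from any real model.
import torch
import torch.nn.functional as F

teacher_logits = torch.tensor([4.0, 2.0, 0.0, -2.0, -4.0])  # peaked teacher
student_logits = torch.zeros(5)                               # uniform student

teacher = F.softmax(teacher_logits, dim=-1)
student = F.softmax(student_logits, dim=-1)
log_teacher = F.log_softmax(teacher_logits, dim=-1)
log_student = F.log_softmax(student_logits, dim=-1)

# Forward KL: KL(teacher || student), the standard KD objective.
# Mean-seeking: the student must cover every mode the teacher assigns mass to.
forward_kl = torch.sum(teacher * (log_teacher - log_student))

# Reverse KL: KL(student || teacher), the objective the MiniLLM summary refers to.
# Mode-seeking: the student is penalized for putting mass where the teacher does not.
reverse_kl = torch.sum(student * (log_student - log_teacher))

print(f"forward KL = {forward_kl.item():.3f}, reverse KL = {reverse_kl.item():.3f}")
```

On this toy example the two objectives already differ in value; in practice the more important difference is the gradient behavior, where the reverse KL discourages the student from generating sequences the teacher considers unlikely.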