Pith: machine review for the scientific record

MiniLLM: Knowledge Distillation of Large Language Models

6 Pith papers cite this work. Polarity classification is still indexing.

Citing papers by year: 2026 (6)

representative citing papers

GRAFT: Graph-Tokenized LLMs for Tool Planning

cs.LG · 2026-05-12 · unverdicted · novelty 6.0

GRAFT internalizes tool dependency graphs via dedicated special tokens in LLMs and applies on-policy context distillation to achieve higher exact sequence matching and dependency legality than prior external-graph methods.

Flow-OPD: On-Policy Distillation for Flow Matching Models

cs.CV · 2026-05-08 · unverdicted · novelty 6.0 · 2 refs

Flow-OPD applies on-policy distillation to flow matching models, achieving GenEval of 92 and OCR accuracy of 94 on Stable Diffusion 3.5 Medium while avoiding the seesaw effect of multi-reward optimization.
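Both citing papers build on on-policy distillation: the student is trained on outputs drawn from its *own* distribution, with the teacher scoring those samples, which amounts to minimizing a reverse KL divergence estimated on student-generated data. A toy sketch of that estimation step, under illustrative distributions (none of the names or numbers come from either paper):

```python
import math
import random

def on_policy_reverse_kl(student_probs, teacher_probs, num_samples=10000, seed=0):
    """Monte-Carlo estimate of the reverse KL D(q_student || p_teacher),
    computed from samples drawn from the student itself (the on-policy step)."""
    rng = random.Random(seed)
    tokens = list(range(len(student_probs)))
    total = 0.0
    for _ in range(num_samples):
        # Sample a token from the *student's own* distribution, not the teacher's.
        t = rng.choices(tokens, weights=student_probs, k=1)[0]
        # Per-sample reverse-KL term: log q(t) - log p(t).
        total += math.log(student_probs[t]) - math.log(teacher_probs[t])
    return total / num_samples

# Illustrative 3-token vocabulary: the estimate converges to the exact reverse KL.
student = [0.7, 0.2, 0.1]
teacher = [0.5, 0.3, 0.2]
estimate = on_policy_reverse_kl(student, teacher)
exact = sum(q * math.log(q / p) for q, p in zip(student, teacher))
```

Sampling from the student rather than the teacher is what distinguishes on-policy distillation from standard (forward-KL) distillation on teacher data; it penalizes the student for placing probability mass where the teacher places little.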
