pith. machine review for the scientific record.

Unveiling Super Experts in Mixture-of-Experts Large Language Models

2 Pith papers cite this work. Polarity classification for these citations is still being indexed.

2 Pith papers citing it

fields: cs.AR (1) · cs.LG (1)

years: 2026 (2)

verdicts: unverdicted (2)

representative citing papers

Preserving Long-Tailed Expert Information in Mixture-of-Experts Tuning

cs.LG · 2026-04-24 · unverdicted · novelty 7.0

A new SFT framework for MoE models combines bias-driven sparsification with gated condenser experts to retain long-tailed expert information, outperforming DenseMixer and ESFT by over 2.5% on math reasoning and commonsense QA benchmarks.
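Neither paper's implementation appears in this listing. Purely as an illustration of the routing statistics involved, the sketch below assumes a standard top-k gated MoE layer and tallies how often each expert is selected in one batch, separating heavily activated experts from the long tail that aggressive sparsification tends to drop. All names and thresholds (`num_experts`, `top_k`, the 3x and 0.25x cutoffs, the synthetic per-expert bias) are assumptions made for the example, not values from either paper.

```python
import torch

# Illustrative routing statistics for a top-k gated MoE layer.
# NOTE: this is not code from either paper; it only sketches how
# per-expert selection counts separate heavily used ("super") experts
# from the long tail that sparsification risks discarding.
torch.manual_seed(0)
num_tokens, num_experts, top_k = 4096, 64, 2

# Stand-in router logits; a fixed per-expert bias gives the synthetic
# routing the kind of load skew the super-expert analysis is about.
router_logits = torch.randn(num_tokens, num_experts) + 2.0 * torch.randn(num_experts)

# Standard top-k expert selection per token.
probs = router_logits.softmax(dim=-1)
_, topk_ids = torch.topk(probs, top_k, dim=-1)

# How often each expert is selected across the batch.
counts = torch.bincount(topk_ids.flatten(), minlength=num_experts).float()
load_share = counts / counts.sum()

# Hypothetical cutoffs: experts far above the uniform share behave like
# "super experts"; experts far below it form the long tail.
uniform = 1.0 / num_experts
super_experts = (load_share > 3.0 * uniform).nonzero(as_tuple=True)[0]
long_tail = (load_share < 0.25 * uniform).nonzero(as_tuple=True)[0]
print("super experts:", super_experts.tolist())
print("long-tail experts:", long_tail.tolist())
```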

citing papers explorer

Showing 2 of 2 citing papers.

  • Preserving Long-Tailed Expert Information in Mixture-of-Experts Tuning
    cs.LG · 2026-04-24 · unverdicted · none · ref 27

    A new SFT framework for MoE models combines bias-driven sparsification with gated condenser experts to retain long-tailed expert information, outperforming DenseMixer and ESFT by over 2.5% on math reasoning and commonsense QA benchmarks.

  • Accelerating MoE with Dynamic In-Switch Computing on Multi-GPUs
    cs.AR · 2026-05-07 · unverdicted · none · ref 47

    DySHARP accelerates MoE expert parallelism via dynamic multimem addressing and token-centric kernel fusion to cut redundant traffic and deliver up to 1.79x speedup over prior in-switch solutions.
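DySHARP's actual in-switch mechanism is not described here beyond that one line. As a hedged illustration of the baseline it improves on, the sketch below assumes plain expert parallelism in which experts are sharded evenly across ranks and every selected (token, expert) pair is shipped to the rank owning that expert; it then counts how many of those transfers duplicate a token already headed to the same rank, one plausible source of the redundant traffic such approaches aim to cut. The sharding layout, sizes, and names are assumptions for the example only.

```python
import torch

# Illustrative accounting for baseline expert-parallel dispatch (NOT
# DySHARP code). Assumption: num_experts experts are sharded evenly over
# num_ranks GPUs, and each selected (token, expert) pair is sent to the
# rank owning that expert during the all-to-all dispatch.
torch.manual_seed(0)
num_tokens, num_experts, top_k, num_ranks = 8192, 64, 2, 8
experts_per_rank = num_experts // num_ranks

probs = torch.randn(num_tokens, num_experts).softmax(dim=-1)
_, topk_ids = torch.topk(probs, top_k, dim=-1)

# Destination rank for each selected expert, shape (num_tokens, top_k).
dest_rank = topk_ids // experts_per_rank

# Naive traffic: one token copy per selected expert, even when two of a
# token's experts live on the same rank.
naive_copies = dest_rank.numel()

# Deduplicated traffic: one copy per unique (token, rank) pair -- the
# gap between the two is the redundancy a smarter dispatch could remove.
token_ids = torch.arange(num_tokens).repeat_interleave(top_k)
pairs = torch.stack([token_ids, dest_rank.flatten()])
unique_pairs = torch.unique(pairs, dim=1).shape[1]

print(f"naive dispatch: {naive_copies} token copies")
print(f"deduplicated:   {unique_pairs} copies "
      f"({naive_copies - unique_pairs} redundant)")
```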