pith. machine review for the scientific record.

Title resolution pending

1 Pith paper cites this work. Polarity classification is still indexing.


fields

cs.LG 1

years

2026 1

verdicts

ACCEPT 1

representative citing papers

Dispatch-Aware Ragged Attention for Pruned Vision Transformers

cs.LG · 2026-04-16 · accept · novelty 6.0

A new Triton kernel for dispatch-aware ragged attention delivers 1.88-2.51× end-to-end throughput gains over standard padded attention, and 9-12% over FlashAttention-2 varlen, in pruned ViTs by lowering the dispatch floor to ~24 μs.

citing papers explorer

Showing 1 of 1 citing paper.

  • Dispatch-Aware Ragged Attention for Pruned Vision Transformers cs.LG · 2026-04-16 · accept · none · ref 7
