pith. machine review for the scientific record.


FinanceBench: A New Benchmark for Financial Question Answering

17 Pith papers cite this work. Polarity classification is still indexing.



years: 2026 (17)

representative citing papers

Training Transformers for KV Cache Compressibility

cs.LG · 2026-05-07 · unverdicted · novelty 6.0 · 2 refs

Training transformers with KV sparsification during continued pretraining produces representations that admit better post-hoc KV cache compression, improving quality under memory budgets for long-context tasks.

citing papers explorer

Showing 4 of 4 citing papers after filters.