pith. machine review for the scientific record.

Title resolution pending

1 Pith paper cites this work. Polarity classification is still indexing.

fields: eess.SP (1)

years: 2026 (1)

verdicts: unverdicted (1)

representative citing papers

BFLA: Block-Filtered Long-Context Attention Mechanism

eess.SP · 2026-05-12 · unverdicted · novelty 4.0

BFLA is a two-stage block-filtered sparse prefill attention mechanism: it constructs an input-dependent block mask and applies tile-level rescues to skip unimportant KV tiles while preserving exact attention inside retained tiles, delivering prefill speedups on models like Llama 3.1 with minimal accuracy loss.
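The two-stage idea in the summary above can be sketched in a few lines. This is a minimal illustration, not BFLA's actual method: the block scorer (mean-pooled queries against mean-pooled keys), the top-`keep` mask standing in for the paper's mask-plus-rescue logic, and all function and parameter names are assumptions introduced here.

```python
import numpy as np

def bfla_prefill(Q, K, V, block=16, keep=4):
    """Hypothetical sketch of block-filtered sparse prefill attention.

    Stage 1: score each KV block by a cheap proxy (mean-pooled query dotted
    with the block's mean key) and build an input-dependent block mask that
    keeps the top `keep` blocks. Stage 2: run exact softmax attention over
    the retained KV tiles only, skipping all other tiles.
    """
    n, d = K.shape
    nblocks = (n + block - 1) // block
    q_pool = Q.mean(axis=0)                       # pooled query, shape (d,)
    scores = np.empty(nblocks)
    for b in range(nblocks):
        k_blk = K[b * block:(b + 1) * block]
        scores[b] = q_pool @ k_blk.mean(axis=0)   # block-importance proxy
    kept = np.sort(np.argsort(scores)[-keep:])    # input-dependent block mask
    idx = np.concatenate([np.arange(b * block, min((b + 1) * block, n))
                          for b in kept])
    Ks, Vs = K[idx], V[idx]                       # retained KV tiles only
    logits = Q @ Ks.T / np.sqrt(d)                # exact attention inside tiles
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ Vs
```

When `keep` covers every block, the mask retains all tiles and the result matches dense attention exactly; the speedup in the real mechanism comes from choosing `keep` well below the block count on long contexts.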

citing papers explorer

Showing 1 of 1 citing paper.

  • BFLA: Block-Filtered Long-Context Attention Mechanism eess.SP · 2026-05-12 · unverdicted · none · ref 13
