pith. machine review for the scientific record.


BERT Rediscovers the Classical NLP Pipeline

12 Pith papers cite this work. Polarity classification is still indexing.



years: 2026 (10) · 2022 (2)

representative citing papers

OPT: Open Pre-trained Transformer Language Models

cs.CL · 2022-05-02 · unverdicted · novelty 7.0

OPT releases open decoder-only transformers up to 175B parameters that match GPT-3 performance at roughly one-seventh the carbon cost, along with code and training logs.

A Layer-wise Analysis of Supervised Fine-Tuning

cs.LG · 2026-04-12 · unverdicted · novelty 6.0

Middle layers (the 20–80% depth range) remain stable during SFT while the final layers are sensitive, motivating Mid-Block Efficient Tuning, which outperforms LoRA by up to 10.2% on GSM8K with a reduced parameter count.
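The layer-selection idea in this entry can be sketched as a simple depth-fraction rule. A minimal sketch, assuming the method unfreezes only the layers whose depth fraction falls in the 20–80% band; the function names, the band edges, and the choice to tune (rather than freeze) the mid-block are illustrative assumptions, not details taken from the citing paper:

```python
def mid_block_indices(num_layers: int, lo: float = 0.2, hi: float = 0.8) -> list[int]:
    """Indices of transformer layers whose depth fraction lies in [lo, hi).

    Layer i has depth fraction i / num_layers, so the defaults select
    roughly the middle 60% of the stack.
    """
    return [i for i in range(num_layers) if lo <= i / num_layers < hi]


def trainable_mask(num_layers: int) -> list[bool]:
    """True where a layer would be unfrozen under this mid-block rule."""
    mid = set(mid_block_indices(num_layers))
    return [i in mid for i in range(num_layers)]
```

For a 24-layer model the defaults select layers 5–19, leaving the early and late blocks frozen; in a real fine-tuning loop the mask would drive per-layer `requires_grad` settings.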

