pith. machine review for the scientific record.

citation dossier

An empirical investigation of catastrophic forgetting in gradient-based neural networks. arXiv preprint arXiv:1312.6211

Dec 2013 · arXiv 1312.6211

16 Pith papers citing it
17 reference links
cs.LG · top field · 10 papers
UNVERDICTED · top verdict bucket · 15 papers

This arXiv-backed work is queued for a full Pith review once it is picked up by the high-inbound sweep. That review runs reader · skeptic · desk-editor · referee · rebuttal · circularity · lean confirmation · RS check · pith extraction.

read the PDF on arXiv

why this work matters in Pith

Pith has encountered this work in the reference lists of 16 reviewed papers. Its strongest current cluster is cs.LG (10 papers), and the largest review-status bucket among citing papers is UNVERDICTED (15 papers). For highly cited works, this page shows a dossier first and a bounded explorer second; it never tries to render every citing paper at once.
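The phenomenon the dossier's subject paper investigates is easy to reproduce: train a model on one task, then on a second task whose decision rule conflicts with the first, and performance on the first task collapses because nothing protects the old weights. A minimal toy sketch with plain logistic regression on synthetic data (illustrative only; the paper's own experiments use neural networks, and all names and hyperparameters below are this sketch's, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true):
    # Linearly separable binary task defined by a ground-truth direction.
    X = rng.normal(size=(500, 2))
    y = (X @ w_true > 0).astype(float)
    return X, y

def train(w, X, y, lr=0.5, epochs=50):
    # Plain full-batch logistic-regression gradient descent; nothing
    # penalises drift away from previously learned weights.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

# Task B's decision rule nearly reverses task A's.
XA, yA = make_task(np.array([1.0, 0.0]))
XB, yB = make_task(np.array([-1.0, 0.2]))

w = train(np.zeros(2), XA, yA)
acc_A_before = accuracy(w, XA, yA)   # near-perfect right after task A
w = train(w, XB, yB)                 # sequential training on task B only
acc_A_after = accuracy(w, XA, yA)    # task A performance collapses
```

Task A accuracy falls sharply once the same weights are repurposed for task B, which is the signature the paper measures at neural-network scale.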

years

2026 · 15 papers
2023 · 1 paper

representative citing papers

Debiasing LLMs by Fine-tuning

q-fin.GN · 2026-04-03 · unverdicted · novelty 6.0

Supervised fine-tuning with LoRA on rational benchmark forecasts corrects extrapolation bias out-of-sample in LLM predictions for controlled experiments and cross-sectional stock returns.
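LoRA, named in the summary above, fine-tunes by freezing a pretrained weight matrix W and learning only a low-rank update B·A, so far fewer parameters move. A toy numpy sketch of that mechanism (a generic illustration under made-up dimensions and targets, not the cited paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 16, 8, 2                   # rank r much smaller than d_in, d_out

W = rng.normal(size=(d_out, d_in)) * 0.25   # frozen "pretrained" weight
A = rng.normal(size=(r, d_in)) * 0.25       # trainable down-projection
B = np.zeros((d_out, r))                    # zero init: adapter starts as a no-op

X = rng.normal(size=(64, d_in))
Y = X @ rng.normal(size=(d_in, d_out))      # toy fine-tuning targets

W0 = W.copy()
loss0 = float(((X @ W.T - Y) ** 2).mean())  # loss before any adaptation

lr = 0.02
for _ in range(2000):
    err = X @ (W + B @ A).T - Y             # residuals of the adapted model
    # Gradients flow only into the low-rank factors; W is never updated.
    gB = err.T @ (X @ A.T) / len(X)
    gA = B.T @ err.T @ X / len(X)
    B -= lr * gB
    A -= lr * gA

loss = float(((X @ (W + B @ A).T - Y) ** 2).mean())
```

Only r·(d_in + d_out) = 48 parameters train here versus 128 for the full matrix, which is the economy that makes LoRA attractive for fine-tuning large models.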

Online Generalised Predictive Coding

stat.ML · 2026-05-04 · unverdicted · novelty 5.0

Online generalised predictive coding (ODEM) tracks latent states in nonlinear and chaotic generative models by separating temporal scales for fast Bayesian belief updating and slow parameter learning.
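The fast/slow split described in that summary can be illustrated generically: correct a latent-state belief at every observation (fast timescale) while nudging a dynamics parameter with a much smaller learning rate (slow timescale). A crude, non-Bayesian stand-in for that idea, tracking a scalar AR(1) state; this is not the cited ODEM method, and every constant below is this sketch's assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
a_true = 0.95                     # true AR(1) dynamics coefficient
z = 0.0                           # latent state
z_hat, a_hat = 0.0, 0.5           # state belief, parameter estimate
k, lr = 0.7, 0.01                 # fast correction gain vs. slow learning rate

for _ in range(5000):
    z = a_true * z + 0.2 * rng.normal()       # latent state evolves
    y = z + 0.05 * rng.normal()               # noisy observation arrives
    z_prev = z_hat
    z_pred = a_hat * z_hat                    # predict with current parameter
    z_hat = z_pred + k * (y - z_pred)         # FAST: correct belief every step
    a_hat += lr * z_prev * (z_hat - a_hat * z_prev)   # SLOW: tiny LMS nudge
```

Because the belief is corrected aggressively each step while the parameter moves slowly, `a_hat` drifts from its poor initial guess toward the true dynamics coefficient without the two updates destabilising each other.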

citing papers explorer

Showing 16 of 16 citing papers.