pith. machine review for the scientific record.

How to Train Data-Efficient LLMs

8 Pith papers cite this work. Polarity classification is still indexing.

8 Pith papers citing it

citation-role summary

background (1)

citation-polarity summary

still indexing

years

2026 (7) · 2023 (1)

representative citing papers

KoCo: Conditioning Language Model Pre-training on Knowledge Coordinates

cs.CL · 2026-04-14 · unverdicted · novelty 6.0

KoCo conditions LLM pre-training by prepending three-dimensional semantic coordinates to documents, improving performance on 10 downstream tasks, accelerating convergence by 30%, and helping distinguish facts from noise to reduce hallucinations.
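The summary above describes prepending semantic coordinates to documents before pre-training. A minimal sketch of what that conditioning step might look like, assuming a text-prefix format; the tag name, field names, and precision are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of KoCo-style conditioning: prefix each training
# document with its three-dimensional semantic coordinates so the model
# sees them before the document text. The <coord ...> format is assumed.

def with_coordinates(doc: str, coords: tuple[float, float, float]) -> str:
    """Prepend a document's 3-D semantic coordinates as a text prefix."""
    x, y, z = coords
    prefix = f"<coord x={x:.2f} y={y:.2f} z={z:.2f}>"
    return f"{prefix} {doc}"

sample = with_coordinates("Transformers scale with data.", (0.12, 0.85, 0.33))
# sample is "<coord x=0.12 y=0.85 z=0.33> Transformers scale with data."
```

In a real pipeline, the prefixed string would then be tokenized as usual, so the coordinates act as ordinary conditioning tokens at the start of each document.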

A Survey of Large Language Models

cs.CL · 2023-03-31 · accept · novelty 3.0

This survey reviews the background, key techniques, and evaluation methods for large language models, emphasizing emergent abilities that appear at large scales.

citing papers explorer

Showing 8 of 8 citing papers.