Curriculum Learning for Domain Adaptation in Neural Machine Translation
Classification: cs.CL
Keywords: domain, neural, approach, curriculum learning, machine translation, adapt
Abstract
We introduce a curriculum learning approach to adapt generic neural machine translation models to a specific domain. Samples are grouped by their similarities to the domain of interest and each group is fed to the training algorithm with a particular schedule. This approach is simple to implement on top of any neural framework or architecture, and consistently outperforms both unadapted and adapted baselines in experiments with two distinct domains and two language pairs.
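The abstract's idea (group training samples by similarity to the target domain, then feed groups to the trainer on a schedule) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the function names, the number of groups, and the schedule that progressively mixes less-similar groups into the sampling pool are all assumptions for the sketch.

```python
import random


def group_by_similarity(samples, scores, n_groups=4):
    """Sort samples by domain-similarity score (hypothetical, in [0, 1])
    and split them into n_groups bins, most similar first."""
    order = sorted(range(len(samples)), key=lambda i: scores[i], reverse=True)
    size = -(-len(order) // n_groups)  # ceiling division
    return [[samples[i] for i in order[k:k + size]]
            for k in range(0, len(order), size)]


def curriculum_batches(groups, batch_size=2, seed=0):
    """One plausible schedule (the paper's exact schedule may differ):
    in phase t, draw shuffled batches from groups 0..t, so training starts
    on the most in-domain-like data and gradually mixes in the rest."""
    rng = random.Random(seed)
    for t in range(len(groups)):
        pool = [s for g in groups[:t + 1] for s in g]
        rng.shuffle(pool)
        for k in range(0, len(pool), batch_size):
            yield pool[k:k + batch_size]
```

Because the schedule only controls which examples reach the training loop, a sketch like this sits on top of any framework or architecture, matching the abstract's claim of framework independence.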
Forward citations
Cited by 1 Pith paper
Demystifying CLIP Data
MetaCLIP curates balanced 400M-pair subsets from CommonCrawl that outperform CLIP data, reaching 70.8% zero-shot ImageNet accuracy on ViT-B versus CLIP's 68.3%.
Discussion (0)