pith. machine review for the scientific record.

arxiv: 1710.11041 · v2 · submitted 2017-10-30 · 💻 cs.CL · cs.AI · cs.LG

Recognition: unknown

Unsupervised Neural Machine Translation

Authors on Pith no claims yet
classification 💻 cs.CL · cs.AI · cs.LG
keywords corpora · parallel · model · translation · unsupervised · completely · machine · monolingual
read the original abstract

In spite of the recent success of neural machine translation (NMT) in standard benchmarks, the lack of large parallel corpora poses a major practical problem for many language pairs. There have been several proposals to alleviate this issue with, for instance, triangulation and semi-supervised learning techniques, but they still require a strong cross-lingual signal. In this work, we remove the need for parallel data entirely and propose a novel method to train an NMT system in a completely unsupervised manner, relying on nothing but monolingual corpora. Our model builds upon the recent work on unsupervised embedding mappings, and consists of a slightly modified attentional encoder-decoder model that can be trained on monolingual corpora alone using a combination of denoising and backtranslation. Despite the simplicity of the approach, our system obtains 15.56 and 10.21 BLEU points in WMT 2014 French-to-English and German-to-English translation, respectively. The model can also profit from small parallel corpora, attaining 21.81 and 15.24 points when combined with 100,000 parallel sentences. Our implementation is released as an open source project.
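The denoising objective in the abstract trains the encoder-decoder to reconstruct a sentence from a corrupted copy of itself, so the model cannot simply learn to copy its input. A minimal sketch of such a corruption function is below; the specific noise operations (swapping adjacent words and dropping words) and the rates used are illustrative assumptions, not the paper's exact configuration:

```python
import random

def add_noise(tokens, swap_prob=0.5, drop_prob=0.1, rng=None):
    """Corrupt a token sequence for denoising training.

    Illustrative noise model: randomly swap adjacent words, then
    randomly drop words (always keeping at least one token).
    """
    rng = rng or random.Random(0)
    out = list(tokens)
    # Pass 1: swap adjacent word pairs with probability swap_prob.
    i = 0
    while i < len(out) - 1:
        if rng.random() < swap_prob:
            out[i], out[i + 1] = out[i + 1], out[i]
            i += 2  # skip past the swapped pair
        else:
            i += 1
    # Pass 2: drop each word with probability drop_prob.
    kept = [t for t in out if rng.random() >= drop_prob]
    return kept if kept else out[:1]

# Example: with swap_prob=1.0 and no drops, every adjacent pair flips.
print(add_noise(["the", "cat", "sat", "down"], swap_prob=1.0, drop_prob=0.0))
# → ['cat', 'the', 'down', 'sat']
```

In the full training loop this denoising step alternates with backtranslation: the current model translates a monolingual sentence into the other language, and that synthetic pair is then used as supervised training data for the reverse direction.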

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. CodeBLEU: a Method for Automatic Evaluation of Code Synthesis

    cs.SE 2020-09 conditional novelty 7.0

    CodeBLEU improves correlation with human programmer scores on code synthesis tasks by adding syntactic AST matching and semantic data-flow matching to the standard BLEU n-gram approach.

  2. SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing

    cs.CL 2018-08 accept novelty 7.0

    SentencePiece trains subword models directly from raw text to enable language-independent neural text processing.