Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

We propose a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no changes to the model architecture from a standard NMT system but instead introduces an artificial token at the beginning of the input sentence to specify the required target language. Using a shared wordpiece vocabulary, our approach enables multilingual NMT with a single model.
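The core mechanism can be illustrated with a short sketch. The function name, the `<2xx>` token format, and the example sentences below are illustrative assumptions, not taken from the paper's released code: the only claim drawn from the abstract is that an artificial target-language token is prepended to the source sentence before it is fed to the shared model.

```python
def add_target_token(source_sentence: str, target_lang: str) -> str:
    """Prepend an artificial token (here spelled <2xx>, an assumed
    format) telling the single shared model which language to emit."""
    return f"<2{target_lang}> {source_sentence}"

# The same English input, routed to two different target languages
# purely via the prepended token -- the model itself is unchanged.
to_spanish = add_target_token("How are you?", "es")
to_japanese = add_target_token("How are you?", "ja")
print(to_spanish)   # <2es> How are you?
print(to_japanese)  # <2ja> How are you?
```

Because the token is just another input symbol in the shared wordpiece vocabulary, no architectural change is needed; the model learns to condition its output language on it during training.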

https://www.aclweb.org/anthology/Q17-1024.pdf