Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

Bibliographic Details
Title: Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation
Authors: Melvin Johnson, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, Fernanda B. Viégas, Martin Wattenberg, Greg S. Corrado, Macduff Hughes, Jeffrey Dean
Source: Transactions of the Association for Computational Linguistics. 5:339-351
Publication Information: MIT Press - Journals, 2017.
Publication Year: 2017
Subject Terms: Linguistics and Language, Interlingua, Vocabulary, Machine translation, Computer science, Speech recognition, Bridging (programming), Rule-based machine translation, Artificial intelligence, Communication, Transfer-based machine translation, Computer Science Applications, Human-Computer Interaction, Computer-assisted translation, Natural language processing, Sentence
Description: We propose a simple solution for using a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no changes to the model architecture from a standard NMT system; instead, it introduces an artificial token at the beginning of the input sentence to specify the required target language. Using a shared wordpiece vocabulary, our approach enables multilingual NMT systems with a single model. On the WMT’14 benchmarks, a single multilingual model achieves comparable performance for English→French and surpasses state-of-the-art results for English→German. Similarly, a single multilingual model surpasses state-of-the-art results for French→English and German→English on the WMT’14 and WMT’15 benchmarks, respectively. On production corpora, multilingual models of up to twelve language pairs allow for better translation of many individual pairs. Our models can also learn to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation are possible for neural translation. Finally, we present analyses that hint at a universal interlingua representation in our models and show some interesting examples when mixing languages.
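The mechanism the description names is a small preprocessing step: a single artificial token prepended to the source sentence tells one shared model which target language to produce, while the architecture itself is left unchanged. A minimal sketch of that step in Python, assuming the "<2es>"-style token format used in the paper's examples (the function name is illustrative, not the authors' code, and a real pipeline would apply the shared wordpiece tokenizer afterwards):

    # Prepend the artificial target-language token described in the paper,
    # e.g. "<2es>" to request Spanish output from the shared model.
    def add_target_token(source_sentence: str, target_lang: str) -> str:
        return f"<2{target_lang}> {source_sentence}"

    # The model architecture is unchanged; only the input text changes.
    print(add_target_token("Hello, how are you?", "es"))
    # -> <2es> Hello, how are you?

This single-token interface is also what makes the zero-shot case possible: at inference time the token can request a language pair that never appeared together in the training data, and the model bridges implicitly.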
ISSN: 2307-387X
DOI: 10.1162/tacl_a_00065
Access URL: https://explore.openaire.eu/search/publication?articleId=doi_________::c5d1937b2d1cb257c81236e2bfac40c1
https://doi.org/10.1162/tacl_a_00065
Rights: OPEN
Accession Number: edsair.doi...........c5d1937b2d1cb257c81236e2bfac40c1
Database: OpenAIRE