
NEWS AND VIEWS | 05 June 2024

Meta’s AI translation model embraces overlooked languages

  • David I. Adelani

David I. Adelani is in the Department of Computer Science, University College London Centre for Artificial Intelligence, London WC1V 6BH, UK; in the School of Computer Science, McGill University, Montreal, Quebec, Canada; and at Mila — Quebec AI Institute, Montreal, Quebec, Canada.


Machine-translation models use artificial intelligence (AI) to translate one human language into another — a worthy feat, given the potential for enhanced communication to break down the barriers posed by differences in language and culture. Yet most of these models can interpret only a small fraction of the world’s languages, in part because training them requires online data that don’t exist for many languages. The US technology company Meta has designed a project called No Language Left Behind (NLLB) to change that. Writing in Nature, the NLLB team1 presents a publicly available model that can translate between 204 languages, many of which are used in low- and middle-income countries.


doi: https://doi.org/10.1038/d41586-024-00964-2

References

1. NLLB Team. Nature https://doi.org/10.1038/s41586-024-07335-x (2024).
2. Bahdanau, D., Cho, K. & Bengio, Y. Preprint at arXiv https://doi.org/10.48550/arXiv.1409.0473 (2014).
3. Sennrich, R., Haddow, B. & Birch, A. Proc. 54th Annu. Meet. Assoc. Comput. Linguist. (Vol. 1: Long Papers) 1715–1725 (2016).
4. Vaswani, A. et al. Adv. Neural Inf. Process. Syst. 30, 6000–6010 (2017).
5. Conneau, A. et al. Proc. 58th Annu. Meet. Assoc. Comput. Linguist. 8440–8451 (2020).
6. Goyal, N. et al. Trans. Assoc. Comput. Linguist. 10, 522–538 (2022).
7. Adelani, D. I. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2309.07445 (2024).
8. Robinson, N., Ogayo, P., Mortensen, D. R. & Neubig, G. Proc. Eighth Conf. Mach. Transl. 392–418 (2023).
9. Bandarkar, L. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2308.16884 (2023).
10. Adelani, D. et al. Proc. 2022 Conf. North Am. Chapter Assoc. Comput. Linguist. Hum. Lang. Technol. 3053–3070 (2022).
11. Maillard, J. et al. Proc. 61st Annu. Meet. Assoc. Comput. Linguist. (Vol. 1: Long Papers) 2740–2756 (2023).
12. Reid, M. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2403.05530 (2024).
13. Zhang, K. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2402.18025 (2024).


Competing Interests

The author declares no competing interests.



Scientific Credibility of Machine Translation Research: A Meta-Evaluation of 769 Papers

Benjamin Marie, Atsushi Fujita, and Raphael Rubino. 2021. Scientific Credibility of Machine Translation Research: A Meta-Evaluation of 769 Papers. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 7297–7306, Online. Association for Computational Linguistics. https://aclanthology.org/2021.acl-long.566

Neural machine translation: Challenges, progress and future

  • Published: 15 September 2020
  • Volume 63, pages 2028–2050 (2020)

JiaJun Zhang & ChengQing Zong

Machine translation (MT) is a technique that leverages computers to translate human languages automatically. Nowadays, neural machine translation (NMT), which models a direct mapping between source and target languages with deep neural networks, has achieved a major breakthrough in translation performance and become the de facto paradigm of MT. This article reviews the NMT framework, discusses the challenges in NMT, introduces some exciting recent progress and looks ahead to potential future research trends.
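To make the "direct mapping" concrete, here is a minimal sketch of a Transformer-based NMT model using PyTorch's built-in nn.Transformer. The vocabulary sizes, model dimensions and random toy batch are illustrative assumptions, not the configuration of any system discussed in the survey.

```python
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, D_MODEL = 1000, 1000, 128

class TinyNMT(nn.Module):
    """A toy encoder-decoder that maps source token IDs to target logits."""
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, D_MODEL)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, D_MODEL)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.out = nn.Linear(D_MODEL, TGT_VOCAB)

    def forward(self, src_ids, tgt_ids):
        # Causal mask: each target position attends only to earlier positions.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        h = self.transformer(self.src_emb(src_ids), self.tgt_emb(tgt_ids),
                             tgt_mask=tgt_mask)
        return self.out(h)  # per-position logits over the target vocabulary

model = TinyNMT()
src = torch.randint(0, SRC_VOCAB, (2, 7))  # toy batch of 2 source sentences
tgt = torch.randint(0, TGT_VOCAB, (2, 5))  # shifted target prefixes
print(model(src, tgt).shape)               # torch.Size([2, 5, 1000])
```

Training would minimize cross-entropy between these logits and the gold target tokens, which is the standard NMT objective the survey builds on.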



Author information

Authors and Affiliations

National Laboratory of Pattern Recognition, CASIA, Beijing, 100190, China

JiaJun Zhang & ChengQing Zong

University of Chinese Academy of Sciences, Beijing, 100190, China

CAS Center for Excellence in Brain Science and Intelligence Technology, Shanghai, 200031, China

ChengQing Zong


Corresponding authors

Correspondence to JiaJun Zhang or ChengQing Zong.

Additional information

This work was supported by the National Natural Science Foundation of China (Grant Nos. U1836221 and 61673380), and the Beijing Municipal Science and Technology Project (Grant No. Z181100008918017).


About this article

Zhang, J., Zong, C. Neural machine translation: Challenges, progress and future. Sci. China Technol. Sci. 63, 2028–2050 (2020). https://doi.org/10.1007/s11431-020-1632-x


Received: 10 March 2020

Accepted: 09 May 2020

Published: 15 September 2020

Issue Date: October 2020

DOI: https://doi.org/10.1007/s11431-020-1632-x


  • neural machine translation
  • Transformer
  • multimodal translation
  • low-resource translation
  • document translation


Scaling neural machine translation to 200 languages

Collaborators

  • NLLB Team: Marta R Costa-Jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Al Youngblood, Bapi Akula, Loic Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang
  • PMID: 38839963
  • DOI: 10.1038/s41586-024-07335-x

The development of neural techniques has opened up new avenues for research in machine translation. Today, neural machine translation (NMT) systems can leverage highly multilingual capacities and even perform zero-shot translation, delivering promising results in terms of language coverage and quality. However, scaling quality NMT requires large volumes of parallel bilingual data, which are not equally available for the 7,000+ languages in the world1. Focusing on improving the translation quality of a relatively small group of high-resource languages comes at the expense of directing research attention to low-resource languages, exacerbating digital inequities in the long run. To break this pattern, here we introduce No Language Left Behind (NLLB), a single massively multilingual model that leverages transfer learning across languages. We developed a conditional computational model based on the Sparsely Gated Mixture of Experts architecture2–7, which we trained on data obtained with new mining techniques tailored for low-resource languages. Furthermore, we devised multiple architectural and training improvements to counteract overfitting while training on thousands of tasks. We evaluated the performance of our model over 40,000 translation directions using tools created specifically for this purpose: an automatic benchmark (FLORES-200), a human evaluation metric (XSTS) and a toxicity detector that covers every language in our model. Compared with the previous state-of-the-art models, our model achieves an average 44% improvement in translation quality as measured by BLEU. By demonstrating how to scale NMT to 200 languages and making all contributions in this effort freely available for non-commercial use, our work lays important groundwork for the development of a universal translation system.

© 2024. Meta.
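As a practical aside, the released NLLB checkpoint can be queried in a few lines through the Hugging Face transformers library. The model name and the FLORES-200 language codes ("eng_Latn", "yor_Latn") are the publicly documented ones; the snippet is a minimal sketch, not the team's own evaluation pipeline.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

name = "facebook/nllb-200-distilled-600M"  # public distilled NLLB checkpoint
tokenizer = AutoTokenizer.from_pretrained(name, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(name)

inputs = tokenizer("Machine translation breaks down language barriers.",
                   return_tensors="pt")
# Force the decoder to start generating in the target language (Yoruba here).
out = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("yor_Latn"),
    max_length=64,
)
print(tokenizer.batch_decode(out, skip_special_tokens=True)[0])
```

Because a single multilingual model serves all 200 languages, the target language is selected at decode time via the forced first token rather than by loading a separate model per language pair.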


  • Fan, A. et al. Beyond English-centric multilingual machine translation. J. Mach. Learn. Res. 22, 1–48 (2021).
  • Du, N. et al. GLaM: efficient scaling of language models with mixture-of-experts. In Proceedings of the 39th International Conference on Machine Learning Vol. 162, 5547–5569 (PMLR, 2022).
  • Hwang, C. et al. Tutel: adaptive mixture-of-experts at scale. In 6th Conference on Machine Learning and Systems (MLSys, 2023).
  • Lepikhin, D. et al. GShard: scaling giant models with conditional computation and automatic sharding. In International Conference on Learning Representations (ICLR, 2021).
  • Lewis, M., Bhosale, S., Dettmers, T., Goyal, N. & Zettlemoyer, L. BASE layers: simplifying training of large, sparse models. In Proceedings of the 38th International Conference on Machine Learning Vol. 139, 6265–6274 (PMLR, 2021).



Title: An Overview on Machine Translation Evaluation

Abstract: Since the 1950s, machine translation (MT) has been one of the important tasks of AI research and development, and it has passed through several distinct periods and stages, including rule-based methods, statistical methods and, most recently, neural network-based learning methods. Accompanying these staged leaps is the research and development of MT evaluation, which has played an especially important role in statistical and neural translation research. The evaluation task of MT is not only to assess the quality of machine translation, but also to give timely feedback to machine translation researchers on the problems in machine translation itself, and on how to improve and optimise it. In some practical application fields, such as in the absence of reference translations, the quality estimation of machine translation plays an important role as an indicator of the credibility of automatically translated target languages. This report covers the following: a brief history of machine translation evaluation (MTE), the classification of research methods on MTE, and cutting-edge progress, including human evaluation, automatic evaluation and the evaluation of evaluation methods (meta-evaluation). Human evaluation and automatic evaluation include reference-translation-based and reference-translation-independent approaches; automatic evaluation methods include traditional n-gram string matching, models applying syntax and semantics, and deep learning models; evaluation of evaluation methods includes estimating the credibility of human evaluations, the reliability of the automatic evaluation, the reliability of the test set, and so on. Advances in cutting-edge evaluation methods include task-based evaluation, the use of pre-trained language models based on big data, and lightweight optimisation models using distillation techniques.
Comments: 35 pages, in Chinese
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI)
Cite as: [cs.CL]
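To ground the "traditional n-gram string matching" family of automatic metrics described above, here is a minimal BLEU computation with the sacrebleu package; the hypothesis and reference sentences are invented for illustration.

```python
import sacrebleu

hypotheses = ["The cat sat on the mat."]           # system outputs
references = [["The cat is sitting on the mat."]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")  # higher score = more n-gram overlap
```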


Machine Translation

Machine Translation is one of the most important applications of Natural Language Processing. Formed in 2002, the Machine Translation Group, part of the broader Speech and Language group at Microsoft AI and Research, focuses on eliminating language barriers and enabling global communication for written and spoken languages. Our team of research scientists and engineers is working on breakthroughs in deep learning, along with scalable, high-performance systems, to deliver the best translation quality to Microsoft and our customers. Our team works on innovations used by millions of users on a daily basis, with the aim of delivering the fabled “universal translator” shown, so far, only in sci-fi movies.

Our work powers the Microsoft Translator API, which has been used by Microsoft products since 2006, and has been available as an API for customers since 2011. It’s now used extensively within familiar Microsoft products such as Bing, Cortana, Microsoft Edge, Office, SharePoint, Skype, Yammer, and Microsoft Translator Apps.
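For illustration, a text translation request to the service looks roughly like the sketch below. The endpoint and api-version reflect the public Azure Translator documentation as I understand it; the key, region and sentence are placeholders to fill in.

```python
import requests

endpoint = "https://api.cognitive.microsofttranslator.com/translate"
params = {"api-version": "3.0", "from": "en", "to": ["fr", "de"]}
headers = {
    "Ocp-Apim-Subscription-Key": "<your-azure-key>",   # placeholder
    "Ocp-Apim-Subscription-Region": "<your-region>",   # placeholder
    "Content-Type": "application/json",
}
body = [{"text": "Hello, world!"}]

resp = requests.post(endpoint, params=params, headers=headers, json=body)
for translation in resp.json()[0]["translations"]:
    print(translation["to"], "->", translation["text"])
```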

Check out these videos to see how the Microsoft Translator API live feature powers real-time multi-device translations among staff and parents of the Chinook Middle School in the Seattle area.


The Evolution of Translation Technology: From CAT Tools to AI

The world of translation has witnessed a remarkable transformation over the past few decades, thanks to the relentless march of technology. This article delves into the evolution of translation technology, from the humble beginnings of Computer-Assisted Translation (CAT) tools to the groundbreaking impact of Artificial Intelligence (AI) in the industry. We will explore how CAT tools laid the foundation for automation in translation, and then turn to the revolution brought about by AI-driven solutions, which have enhanced efficiency and reshaped the entire landscape of language translation.

Language is a fundamental bridge that connects individuals, communities, and nations. In a globalized world, the demand for translation services has surged, driven by the need for cross-cultural communication, international business, and information exchange. As technology has advanced, so too has the field of translation.

The Era of Human Translation

Before delving into the world of CAT tools and AI, it’s essential to understand the roots of translation. For centuries, human translators were the sole means of bridging language gaps. The process was painstaking, labor-intensive, and often time-consuming. It relied heavily on linguistic expertise, cultural knowledge, and extensive reference materials.

Emergence of Computer-Assisted Translation (CAT) Tools

The advent of computers brought a transformative shift in the translation process. The first CAT tools emerged in the 1960s and 70s, simplifying tasks for human translators. These tools incorporated databases of previously translated texts, allowing translators to access and reuse segments of text. Though revolutionary at the time, CAT tools were limited in their capabilities, primarily assisting with terminology consistency and fragmentary reuse.

The Rise of Translation Memory

One of the key innovations within CAT tools was the development of Translation Memory (TM) systems. TM systems stored pairs of source and target language segments, allowing translators to reuse translations of similar segments in future projects. This not only improved consistency but also reduced the time required for translation, leading to increased efficiency.
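To make the idea concrete, here is a minimal sketch of a translation-memory lookup, assuming a simple in-memory store and fuzzy matching via Python's standard library; the segment pairs are invented, and real TM systems use far more sophisticated segment alignment and scoring.

```python
import difflib

# Minimal translation-memory sketch: store source->target segment pairs and
# retrieve the best fuzzy match above a similarity threshold.
tm = {
    "The printer is out of paper.": "Der Drucker hat kein Papier mehr.",
    "Press the power button.": "Drücken Sie den Netzschalter.",
}

def tm_lookup(segment: str, threshold: float = 0.75):
    # Find the stored source segment most similar to the new one.
    best = difflib.get_close_matches(segment, tm.keys(), n=1, cutoff=threshold)
    if best:
        similarity = difflib.SequenceMatcher(None, segment, best[0]).ratio()
        return tm[best[0]], similarity  # candidate translation + match score
    return None, 0.0

translation, score = tm_lookup("The printer is out of paper")
print(f"{score:.0%} match: {translation}")
```

Matches below 100% are what the industry calls "fuzzy matches"; most CAT tools surface anything above roughly 70-75% similarity for the translator to post-edit rather than translate from scratch.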

Machine Translation (MT) Enters the Scene

While CAT tools were a significant leap forward, the next milestone in the translation technology journey was the emergence of Machine Translation (MT). MT systems, such as Google Translate and early rule-based systems, used algorithms to generate translations automatically. However, the quality of these translations often left much to be desired, and they were generally considered unsuitable for professional use.

The AI Revolution in Translation

The real game-changer for the translation industry was the infusion of Artificial Intelligence. Machine Learning (ML) and Natural Language Processing (NLP) algorithms allowed AI-driven translation systems to understand context, idioms, and nuances in language. Neural Machine Translation (NMT) models built on architectures such as Google’s Transformer improved translation quality significantly.

Neural Machine Translation (NMT)

NMT marked a breakthrough in machine translation. Unlike previous rule-based or statistical MT approaches, NMT leveraged deep learning techniques to process entire sentences rather than just fragments. This approach enabled more fluent and context-aware translations, making it suitable for professional and even literary translation tasks.
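As a concrete illustration, a pretrained NMT model can be run in a few lines. This sketch assumes the open-source Hugging Face transformers library and one of the publicly available Helsinki-NLP MarianMT checkpoints; any comparable sequence-to-sequence model would serve the same purpose.

```python
from transformers import MarianMTModel, MarianTokenizer

# Load a pretrained English->German NMT model (one of the publicly
# available Helsinki-NLP MarianMT checkpoints).
model_name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

sentence = "The new contract must be signed before the end of the month."
batch = tokenizer([sentence], return_tensors="pt", padding=True)
generated = model.generate(**batch)  # the model translates the whole sentence at once
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```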

The Role of Big Data

AI-driven translation systems thrive on vast amounts of data. With the growth of the internet, parallel corpora of text in multiple languages became readily available. This enabled AI models to be trained on extensive datasets, making them more accurate and adaptable to a wide range of topics and languages.

The Power of Customization

AI translation systems also introduced the concept of customization. Organizations could fine-tune AI models to suit their specific needs, ensuring that translations were not only accurate but also aligned with their brand’s tone and style.
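Customization takes many forms. The simplest to sketch is a glossary pass that enforces an organization's preferred terms on top of a generic engine's output; the term pairs below are hypothetical. Full customization, by contrast, typically means fine-tuning the model itself on in-domain parallel data.

```python
import re

# Hypothetical brand glossary: generic MT output term -> preferred term.
glossary = {
    "sign-in": "login",
    "e-mail": "email",
}

def enforce_glossary(mt_output: str) -> str:
    # Replace whole-word occurrences, case-insensitively.
    for generic, preferred in glossary.items():
        mt_output = re.sub(rf"\b{re.escape(generic)}\b", preferred,
                           mt_output, flags=re.IGNORECASE)
    return mt_output

print(enforce_glossary("Enter your e-mail to complete sign-in."))
# -> "Enter your email to complete login."
```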

Real-Time Translation

The integration of AI-driven translation into various digital platforms has brought about real-time translation capabilities. This means that individuals and businesses can communicate seamlessly across language barriers, revolutionizing global business, diplomacy, and international cooperation.

The Human-AI Collaboration

While AI has brought incredible advancements to the translation industry, human expertise remains indispensable. The ideal translation process often involves a collaboration between human translators and AI systems, combining linguistic knowledge with the speed and efficiency of AI.

Final Thoughts on the Evolution of Translation Technology

The evolution of translation technology from CAT tools to AI has been nothing short of extraordinary. What began as a labor-intensive manual process has evolved into a dynamic blend of human intelligence and artificial precision. The synergy between human translators and AI-driven systems has not only revolutionized efficiency but has also opened new horizons for cross-cultural communication, global commerce, and international understanding. As AI continues to advance, the future of translation technology promises even greater breakthroughs, further breaking down the barriers between languages and cultures.

Seldean Smith

Seldean is a multi-skilled content wizard who loves digging into all things language, culture, and localization.



CSA Research

The rise of machine translation is a critical component of meeting growing demands for language services. How will you determine your approach to MT’s capabilities and technology and prepare for how it will affect your business plans and content strategy?

CSA Research’s thought leadership and expertise in this field identify relevant trends early on. Our data-driven research provides comprehensive advice for making smart decisions at the right time.

  • In 2012, CSA Research introduced the concept of Human-Enhanced MT.
  • In 2016, CSA Research coined the phrase Augmented Translation.
  • In 2021, CSA Research predicted the rise of metadata-driven responsive MT and responsible MT.


MT is Constantly Changing – Don’t Get Left Behind

MT is an artificial-intelligence-based technology that has become a mainstream solution; even smaller businesses can implement it.

MT plays a vital role in enabling international commerce and in giving individuals access to critical content in the languages they need.

Stay ahead by accessing the latest independent and objective research by expert thought leaders in the field and then create a plan of action.

Uncover the wealth and depth of our comprehensive research reports 

Read our blog: MT as a platform

Why CSA Research?

CSA Research has long been at the forefront of predicting crucial early developments in MT and helping organizations gain a sustainable competitive advantage in their global markets. Our first report on the subject was in 2013: Data Leakage from Free Machine Translation.

This complex topic requires continuous, in-depth research spanning all the major user groups: developers, LSPs, enterprise and non-profit implementers, human linguists, and global consumers, all of whom want fact-based outcomes in this field.

  • See the importance of MT in reaching global consumers.
  • See how global enterprises are meeting content demands (GCXC).
  • See how many surveys/interactions CSA Research carried out.

Read our blog: Curves Ahead: MT and the Future of Language Technology


The Next Generation of Machine Translation

CSA Research has defined the next generation of machine translation and given it a name: Responsive Machine Translation will deliver results that are more relevant, tailored to specific situations, and adapted automatically to the contexts in which they're generated.

Looking for an introduction to Machine Translation? Explore our TechStack series.

Already an expert? Explore CSA Research’s thought leadership in emerging fields

Why MT Matters More Than Ever

Machine translation technology is on the cusp of a revolution. Don’t run the risk of being left behind. Find out how your company can keep up to date with the latest MT data-driven trends and recommendations by CSA Research.

Get Started Now


For Global Enterprises

Global enterprises need to provide a top-notch customer experience for clients regardless of where they come from, which means you need to speak to them in their language. Machine translation provides a cost-effective means to provide your clients with non-crucial information they can use, even if it isn't perfect.

For LSPs

In an era of falling translation prices and increasing customer demands, machine translation is the way to stay ahead of your competitors. LSPs that already use machine translation enjoy an advantage in the marketplace.


A Wealth and Depth of MT Research

Cutting Through the Hype of MT

Rapidly increasing content volumes do not mean that MT is displacing human translators, despite mainstream business and technology outlets credulously repeating press-release-driven claims that machine translation will soon replace them. The reality is far more complex.

A more realistic scenario is one where AI-driven technologies combine human and machine capabilities to enhance the effectiveness and efficiency of human translators, a paradigm CSA Research has termed “augmented translation.”

At CSA Research, we cut through the hype, discern what matters, avoid drive-by articles that merely summarize the past, and deliver trusted data, insights, tools, and advice on this rapidly moving technology.

Uncover the wealth and depth of our 10 comprehensive research reports in 2022  

MT Can Increase Your Global Footprint

As organizations grapple with meeting global customer expectations for seamless experiences adapted to their language needs, when does integrating MT into their business strategy make sense? Let’s explore some use cases where adopting MT will increase your global footprint.

  • Is your customer experience falling short because you can't deliver content in all the languages you'd like? 
  • Do you find translation, in general, too expensive? 
  • Do you deal with large amounts of content of uncertain value that may need to be translated?
  • Do you deal with user-generated content that customers may wish to see in another language?


Related Research

These reports can also help you on your growth journey. If you're already a client, read them today; not a client yet? Get in touch.

Responsive Machine Translation

This report presents an overview of this developing technology and explains how enterprises, language service providers, and MT developers alike can prepare for the MT revolution.

Augmented Translation

This report highlights augmented translation as a new approach that combines the strengths of humans and machines to address growing needs for multilingual content.

How Do LSPs Use MT?

This report helps LSPs and buyers of language services understand how LSPs leverage this technology and refine their strategy around its use.

How Do Freelancers Use MT?

This report helps LSPs and buyers of language services understand how linguists leverage this technology and refine their processes.

What Do MT Adopters Want?

This report focuses on areas where MT developers can improve their offerings, and where LSPs and enterprises can effectively manage challenges across the supply chain, with an emphasis on both technology and user-interface issues, as well as on how companies that build and deploy MT present their technology and its potential.

11 Must-Know Facts about Machine Translation

CSA Research’s data science and analyst teams combed through thousands of datapoints to synthesize a list of 11 must-know facts and trends to give you a comprehensive view of the practices, pain points, advances, and outcomes that matter in this field.

Understanding Augmented Translation

This presentation provides an overview of the technologies involved, their current status, and how the language industry is changing as it draws closer to achieving the vision of machine-empowered human linguists.

Has Machine Translation Reached Human Parity?

This report examines the basis for statements concerning “human parity” and discusses why they fall short and why they also fail to give proper credit to recent developments in NMT.  


Zero-Shot Translation with Google’s Multilingual Neural Machine Translation System

November 22, 2016

Posted by Mike Schuster (Google Brain Team), Melvin Johnson (Google Translate) and Nikhil Thorat (Google Brain Team)
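The post's key idea, as described in the excerpt later on this page, is that the base GNMT system is left unchanged and an artificial token prepended to the input specifies the target language. Here is a minimal sketch of that input convention; the <2xx> token format is illustrative, not the system's exact vocabulary.

```python
# Sketch of the multilingual input convention: the model itself is unchanged;
# an artificial token prepended to the source names the desired target language.
# The <2xx> token format is illustrative of the approach.
def mark_target_language(source: str, target_lang: str) -> str:
    return f"<2{target_lang}> {source}"

# Train on English<->Spanish and English<->Portuguese data only, then request
# Portuguese->Spanish at inference time: a direction never seen in training,
# i.e. zero-shot translation.
print(mark_target_language("How are you?", "es"))      # <2es> How are you?
print(mark_target_language("Como você está?", "es"))   # zero-shot pt->es
```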




IMAGES

  1. Machine translation approaches
  2. Approaches for machine translation
  3. Machine translation approaches [1].
  4. Research on Machine Translation Technology in English Translation
  5. PPT
  6. Machine Translation vs. Human Translation: Will Artificial Intelligence

VIDEO

  1. Machine Translation

  2. History of AI in Translation

  3. Group 1-The Impact of Using Machine Translation on EFL Student’s Writing

  4. The challenges of machine translation

  5. Customised Machine Translation Solution for Scale

  6. Quantization on a machine translation model

COMMENTS

  1. Transforming machine translation: a deep learning system ...

    Third Conference on Machine Translation (WMT) 482-487 (Association for Computational Linguistics, 2019). ... Ludwig Cancer Research Oxford, University of Oxford, Oxford, OX1 2JD, UK.

  2. Meta's AI translation model embraces overlooked languages

Read the paper: Scaling neural machine translation to 200 languages. Research into machine translation was instrumental in enabling some of the advances [2-4] that led to the development of ...

  3. Progress in Machine Translation

1. A brief history of machine translation (MT). MT is the study of how to use computers to translate from one language into another. The concept of MT was first put forward by Warren Weaver in 1947 [1], just one year after the development of ENIAC (the Electronic Numerical Integrator and Computer), the first computer. From then on, MT has been considered one of the most challenging tasks in the field of ...

  4. A scientometric study of three decades of machine translation research

This study aims to examine machine translation research in journals indexed in the Web of Science to identify trending issues, research hotspots, and document co-citation patterns. To this end, 541 documents published between 1992 and 2022 were retrieved and analyzed using CiteSpace and Bibexcel. Many metrics were analyzed ...

  5. Exploring Massively Multilingual, Massive Neural Machine Translation

Multilingual machine translation processes multiple languages using a single translation model. The success of multilingual training for data-scarce languages has been demonstrated for automatic speech recognition and text-to-speech systems, and by prior research on multilingual translation [1, 2, 3]. We previously studied the effect of ...

  6. Neural machine translation: A review of methods, resources, and tools

Neural machine translation has become the dominant approach to machine translation in both research and practice. This article reviewed the widely used methods in NMT, including modeling, decoding, data augmentation, interpretation, and evaluation. We then summarize the resources and tools that are useful for NMT research.

  7. PDF Scientific Credibility of Machine Translation Research: A Meta

Abstract. This paper presents the first large-scale meta-evaluation of machine translation (MT). We annotated MT evaluations conducted in 769 research papers published from 2010 to 2020. Our study shows that practices for automatic MT evaluation have dramatically changed during the past decade and follow concerning trends.

  8. Machine Translation

    Machine Translation is an excellent example of how cutting-edge research and world-class infrastructure come together at Google. We focus our research efforts on developing statistical translation techniques that improve with more data and generalize well to new languages. Our large scale computing infrastructure allows us to rapidly experiment ...

  9. Scientific Credibility of Machine Translation Research: A Meta

Marie, Benjamin, Atsushi Fujita, and Raphael Rubino. "Scientific Credibility of Machine Translation Research: A Meta-Evaluation of 769 Papers." Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language ...

  10. A Neural Network for Machine Translation, at Production Scale

    Today we announce the Google Neural Machine Translation system (GNMT), which utilizes state-of-the-art training techniques to achieve the largest improvements to date for machine translation quality. Our full research results are described in a new technical report we are releasing today: "Google's Neural Machine Translation System ...

  11. Neural machine translation: Challenges, progress and future

Machine translation (MT) is a technique that leverages computers to translate human languages automatically. Nowadays, neural machine translation (NMT), which models direct mapping between source and target languages with deep neural networks, has achieved a big breakthrough in translation performance and become the de facto paradigm of MT. This article reviews the NMT framework, discusses ...

  12. Understanding the societal impacts of machine translation: a critical

Analytical approach. Our method for analysing the content is informed by MT research and by healthcare and legal (public service) interpreting research in translation studies, which is currently largely concerned with human-based services (e.g., Hsieh, 2016). MT research in translation studies is shedding light on multiple aspects of the technology, including its impact on human ...

  13. Scaling neural machine translation to 200 languages

    By demonstrating how to scale NMT to 200 languages and making all contributions in this effort freely available for non-commercial use, our work lays important groundwork for the development of a universal translation system. The development of neural techniques has opened up new avenues for research in machine translation. Today, neural ...

  14. LexMatcher: Dictionary-centric Data Collection for LLM-based Machine

The fine-tuning of open-source large language models (LLMs) for machine translation has recently received considerable attention, marking a shift from traditional neural machine translation towards data-centric research. However, the area of data collection for instruction fine-tuning in machine translation remains relatively underexplored. In this paper, we present LexMatcher, a simple yet ...

  15. [2202.11027] An Overview on Machine Translation Evaluation

Since the 1950s, machine translation (MT) has been one of the important tasks of AI, and it has experienced several different periods and stages of development, including rule-based methods, statistical methods, and recently proposed neural-network-based learning methods. Accompanying these staged leaps is the evaluation research ...

  16. Machine Translation and Global Research: Towards Improved Machine

With the improvement of the quality of machine translation (MT for short), the use of machine translation has become ubiquitous in real life. Within translation studies, MT has long been viewed as an aid to translation per se, and the focus of MT research has been the development of translation systems and the improvement of translation quality. Information literacy, which entails the use of MT ...

  17. Machine Translation: History, Development, and Limitations

    Abstract. Machine translation (MT) is a term used to describe a range of computer-based activities involving translation. This article reviews sixty years of history of MT research and development, concentrating on the essential difficulties and limitations of the task, and how the various approaches have attempted to solve, or more usually work round, these.

  18. Google's Neural Machine Translation System: Bridging ...

    Abstract. Neural Machine Translation (NMT) is an end-to-end learning approach for automated translation, with the potential to overcome many of the weaknesses of conventional phrase-based translation systems. Unfortunately, NMT systems are known to be computationally expensive both in training and in translation inference.

  19. Machine Translation

    Abstract. Machine Translation (MT) is and always has been a core application in the field of natural-language processing. It is a very active research area and it has been attracting significant commercial interest, most of which has been driven by the deployment of corpus-based, statistical approaches, which can be built in a much shorter time and at a fraction of the cost of traditional ...

  20. Machine Translation

    Machine Translation. Machine Translation is one of the most important applications of Natural Language Processing. Formed in 2002, the Machine Translation Group, part of the broader Speech and Language group at Microsoft AI and Research, focuses on eliminating language barriers and enabling global communication for written and spoken languages.

  21. (PDF) The Machine Translation of Literature: Implications for

Automatic translations generated by the two machine translation systems were compared to human-made Arabic translations with the purpose of identifying the problems within these translations.

  22. (PDF) Machine Translation

field of Artificial Intelligence. Machine translation is a computer program designed to translate text from one language (source language) to another language (target language) without ...

  23. (PDF) Machine translation using natural language processing

    Machine Translation is the translation of text or speech by a computer with no human involvement. It is a popular topic in research with different methods being created, like rule-based ...

  24. Neural Machine Translation: A Review

    Abstract. The field of machine translation (MT), the automatic translation of written text from one natural language into another, has experienced a major paradigm shift in recent years. Statistical MT, which mainly relies on various count-based models and which used to dominate MT research for decades, has largely been superseded by neural ...

  25. The Evolution of Translation Technology: From CAT Tools to AI

    Machine Translation (MT) Enters the Scene. While CAT tools were a significant leap forward, the next milestone in the translation technology journey was the emergence of Machine Translation (MT). MT systems, such as Google Translate and early rule-based systems, used algorithms to generate translations automatically. However, the quality of ...

  26. Machine Translation

    Machine translation provides a cost-effective means to provide your clients with non-crucial information they can use, even if it isn't perfect. For LSPs In an era of falling translation prices and increasing customer demands machine translation is the way to stay ahead of your competitors.

  27. Zero-Shot Translation with Google's Multilingual Neural Machine

    In " Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation ", we address this challenge by extending our previous GNMT system, allowing for a single system to translate between multiple languages. Our proposed architecture requires no change in the base GNMT system, but instead uses an additional "token ...

  28. Life

    Editors select a small number of articles recently published in the journal that they believe will be particularly interesting to readers, or important in the respective research area. The aim is to provide a snapshot of some of the most exciting work published in the various research areas of the journal.