How neural machine translation systems based on AI work

Just a few weeks ago, Meta presented an artificial intelligence model capable of translating between 200 languages. The technology, named ‘No Language Left Behind’ (NLLB-200), is part of a project by Mark Zuckerberg’s company to strengthen its push into the metaverse.

Almost all of the technology giants, with the exception of Apple and Google, are undertaking projects to position themselves in this new virtual universe, which is in full swing. But there are other, more modest companies, some of them local, that began research in this field long ago. For some years now, the company Incyta and the GRIAL research group of the Arts and Humanities Department of the Universitat Oberta de Catalunya (UOC) have been collaborating on a series of research and technology transfer projects related to neural machine translation. The objective of this research is to develop neural machine translation systems that can be integrated into Incyta’s workflow.

This Barcelona-based language services company has been using machine translation systems for years as the basis for post-editing. This workflow of machine translation followed by human post-editing allows it to offer a more efficient and economical translation service, while maintaining the same level of quality, to its wide range of clients: the written press, publishing houses, public administration, universities, etc.

Until a few years ago, machine translation systems offered sufficient quality only for closely related language pairs, such as Spanish-Catalan or Spanish-French. For somewhat more distant pairs, such as Spanish-English, the quality of machine translation was not good enough: it was more efficient to translate the document manually from scratch.

The emergence of today’s neural machine translation systems has made it possible to obtain outstanding quality even for very distant language pairs, such as Chinese-Spanish. The appearance of these systems has been a true revolution in the world of professional translation, since it opens the door to applying the machine translation plus post-editing workflow to most translation jobs.

Rule-based machine translation and corpus-based machine translation

But to understand what this technological revolution is all about, it is worth remembering the two main paradigms of machine translation: rule-based machine translation and corpus-based machine translation. In the first paradigm, the rule-based paradigm, machine translation systems are developed by computer engineers and linguists who write programs, dictionaries and rules to translate a sentence in a source language into a sentence in the target language.

The development of these systems usually involves many months of work by teams of several people. Among the rule-based systems, syntactic transfer systems can be highlighted. In these systems, the sentence in the source language is syntactically parsed to automatically obtain a parse tree. This parse tree, which can be deep or shallow, is transferred to an equivalent tree in the target language using a set of rules.

Once this syntactic tree is obtained in the target language, the words are translated using bilingual dictionaries and the translated words are inflected to obtain a correct sentence in the target language. This paradigm has worked very well for similar languages that have quite similar syntactic structures. There are excellent systems using this methodology that are still in use for similar language pairs such as Spanish and Catalan.
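
To make the syntactic transfer pipeline more concrete, here is a minimal, purely illustrative Python sketch of the parse, transfer, lexical translation and inflection steps described above. The tiny grammar, bilingual dictionary and agreement rule are invented for this example and are not taken from any real system:

```python
# Toy illustration of syntactic transfer: parse, transfer the tree,
# translate the words with a bilingual dictionary, then inflect.
# Grammar, dictionary and rules are invented for this example only.

BILINGUAL_DICT = {"the": "el", "white": "blanco", "cat": "gato",
                  "small": "pequeño", "house": "casa"}
FEMININE_NOUNS = {"casa"}  # toy morphological information

def parse(sentence):
    """Very shallow 'parse': a determiner-adjective-noun phrase."""
    det, adj, noun = sentence.split()
    return ("NP", ("DET", det), ("ADJ", adj), ("N", noun))

def transfer(tree):
    """Structural transfer rule: English DET ADJ N -> Spanish DET N ADJ."""
    label, det, adj, noun = tree
    return (label, det, noun, adj)

def inflect(words):
    """Toy agreement rule: a feminine noun forces feminine det/adj forms."""
    det, noun, adj = words
    if noun in FEMININE_NOUNS:
        det = "la"
        if adj.endswith("o"):
            adj = adj[:-1] + "a"
    return [det, noun, adj]

def translate(sentence):
    tree = transfer(parse(sentence))
    words = [BILINGUAL_DICT.get(word, word) for _, word in tree[1:]]
    return " ".join(inflect(words))

print(translate("the white cat"))    # -> el gato blanco
print(translate("the small house"))  # -> la casa pequeña
```

Real transfer systems encode thousands of such rules and dictionary entries, which is why their development takes teams of linguists and engineers many months.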

In the second paradigm, corpus-based machine translation, systems are not developed but trained: they learn to translate from texts in the source language and in the target language. Parallel corpora, i.e. sets of segments or sentences in one language paired with their translation equivalents in another language, are normally used to train these systems.
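
As a purely illustrative sketch (the file name and example sentences are invented), a parallel corpus is often stored as tab-separated sentence pairs and loaded as (source, target) tuples before training:

```python
# Illustrative only: read a tab-separated parallel corpus into sentence pairs.
# The file name and example sentences are invented for this sketch.

def load_parallel_corpus(path):
    """Return a list of (source_sentence, target_sentence) pairs."""
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            source, target = line.split("\t", maxsplit=1)
            pairs.append((source, target))
    return pairs

# Example contents of "es-en.tsv":
#   La casa es blanca<TAB>The house is white
#   El gato duerme<TAB>The cat is sleeping
pairs = load_parallel_corpus("es-en.tsv")
print(pairs[0])  # ('La casa es blanca', 'The house is white')
```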

Chronology of machine translation


The first corpus-based systems were statistical machine translation systems, which burst onto the market around 2005. These systems are based on the calculation of two probabilities: the probability that a given sentence in the target language is the translation of a sentence in the source language, and the probability that a given sentence in the target language is a correct sentence in that language. The first probability is estimated from statistics obtained from the parallel corpus, while the second is estimated from statistics obtained from a monolingual corpus of the target language. This monolingual corpus can simply be the target-language half of the parallel corpus.
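
Put as a formula, these systems choose the target sentence t that maximises P(s | t) · P(t) for a source sentence s: the translation model P(s | t) is estimated from the parallel corpus, and the language model P(t) from the monolingual corpus. The following is a minimal, purely illustrative scoring sketch in Python, with toy probability tables invented for the example rather than estimated from any real corpus:

```python
import math

# Toy noisy-channel scoring: score(t) = log P(s | t) + log P(t).
# Both probability tables are invented for illustration; a real system
# estimates them from large parallel and monolingual corpora.

# Translation model P(source_word | target_word), from the parallel corpus.
TM = {
    ("la", "the"): 0.9, ("casa", "house"): 0.9, ("blanca", "white"): 0.8,
}

# Bigram language model P(word | previous_word), from the monolingual corpus.
LM = {
    ("<s>", "the"): 0.3, ("the", "white"): 0.05, ("white", "house"): 0.2,
    ("the", "house"): 0.1, ("house", "white"): 0.001,
}

FLOOR = 1e-6  # probability assigned to unseen pairs

def log_tm(source_words, target_words):
    # Simplification in the spirit of IBM Model 1: each source word is
    # explained by the target word that translates it best.
    return sum(math.log(max(TM.get((s, t), FLOOR) for t in target_words))
               for s in source_words)

def log_lm(target_words):
    total, prev = 0.0, "<s>"
    for word in target_words:
        total += math.log(LM.get((prev, word), FLOOR))
        prev = word
    return total

def score(source, candidate):
    s, t = source.split(), candidate.split()
    return log_tm(s, t) + log_lm(t)

source = "la casa blanca"
for candidate in ("the house white", "the white house"):
    print(candidate, round(score(source, candidate), 2))
# Both candidates explain the source words equally well, so the language
# model decides: it prefers the more natural order "the white house".
```

A real statistical decoder also has to search the enormous space of candidate translations, but the scoring principle is the same: the parallel corpus supplies translation adequacy and the monolingual corpus supplies fluency.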
