2017 DeepLearningTakesonTranslation

From GM-RKB

Subject Headings: Deep Learning-based NL Translation Algorithm.

Notes

Cited By

Quotes

Abstract

Improvements in hardware, the availability of massive amounts of data, and algorithmic upgrades are among the factors supporting better machine translation.

Body

Turning to Translation

Most implementations of translation employ two neural networks. The first, called the encoder, processes input text from one language to create a fixed-length vector representation of the evolving input. A second "decoder" network monitors this vector to produce text in a different language. Typically, the encoder and decoder are trained as a pair for each choice of source and target language. …
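The encoder's job described above can be sketched with a toy recurrent update: however long the input sequence is, it is folded into one fixed-length vector that the decoder then reads. Everything here (the dimensions, the random untrained weights, the `encode` function) is a hypothetical illustration, not the architecture from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
VOCAB, EMB, HID = 10, 8, 16

E = rng.normal(size=(VOCAB, EMB))         # toy embedding table
W_in = rng.normal(size=(EMB, HID)) * 0.1  # input-to-hidden weights
W_h = rng.normal(size=(HID, HID)) * 0.1   # hidden-to-hidden weights


def encode(token_ids):
    """Fold a variable-length token sequence into one fixed-length vector."""
    h = np.zeros(HID)
    for t in token_ids:
        # Recurrent update: the hidden state absorbs one token at a time.
        h = np.tanh(E[t] @ W_in + h @ W_h)
    return h  # same shape no matter how long the input was


v_short = encode([1, 2])
v_long = encode([1, 2, 3, 4, 5, 6])
assert v_short.shape == v_long.shape == (HID,)
```

The point of the sketch is the bottleneck: a two-token and a six-token sentence both come out as a single vector of size `HID`, which is exactly the representation the decoder monitors.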

An additional critical element is the use of "attention," which Cho said was "motivated from human translation." As translation proceeds, based on what has been translated so far, this attention mechanism selects the most useful part of the text to translate next.
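The selection step Cho describes can be illustrated with the common dot-product form of attention: score each source position against the current decoder state, normalize the scores with a softmax, and mix the source representations accordingly. This is a minimal numpy sketch, not the specific model discussed in the article; the states and dimensions are made up.

```python
import numpy as np


def attend(decoder_state, encoder_states):
    """Weight each source position by its relevance to the decoder state."""
    scores = encoder_states @ decoder_state  # dot-product relevance scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over source positions
    context = weights @ encoder_states       # weighted mix of source vectors
    return weights, context


# Three hypothetical source-position vectors.
enc = np.array([[1.0, 0.0],
                [0.0, 1.0],
                [2.0, 1.0]])
weights, context = attend(np.array([1.0, 0.0]), enc)
# The weights sum to 1, and the position most aligned with the
# decoder state receives the largest weight.
```

At each output step the decoder state changes, so the weights change too: the mechanism keeps re-selecting "the most useful part of the text to translate next," as the article puts it.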

Attention models "really made a big difference," said LeCun. "That's what everybody is using right now."

A Universal Language?

The separation of the encoder for one language from the decoder for another language raises an intriguing question about the vector that passes information between the two. …

References


Don Monroe (2017). "Deep Learning Takes on Translation". doi:10.1145/3077229.